Mar 21 03:46:06 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 21 03:46:06 crc restorecon[4684]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 03:46:06 crc restorecon[4684]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 03:46:06 crc restorecon[4684]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 03:46:06 crc restorecon[4684]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 21 03:46:06 crc restorecon[4684]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 03:46:06 crc restorecon[4684]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 03:46:06 crc restorecon[4684]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 21 03:46:06 crc restorecon[4684]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 03:46:06 crc restorecon[4684]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 03:46:06 crc restorecon[4684]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 03:46:06 crc restorecon[4684]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 21 03:46:06 crc restorecon[4684]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:06 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:06 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 
03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 03:46:07 crc 
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 
03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc 
restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 03:46:07 crc restorecon[4684]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 03:46:07 crc restorecon[4684]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 03:46:07 crc restorecon[4684]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 21 03:46:07 crc kubenswrapper[4685]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 21 03:46:07 crc kubenswrapper[4685]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 21 03:46:07 crc kubenswrapper[4685]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 21 03:46:07 crc kubenswrapper[4685]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
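The restorecon records above all share one shape: an absolute path under /var/lib/kubelet, the phrase "not reset as customized by admin to", and a target SELinux context whose trailing MCS category pair (s0:c7,c13 for the two catalog pods here, s0:c682,c947 for the oauth pod, plain s0 for node-level plugin paths) is what isolates one pod's files from another's. Below is a minimal sketch for grouping such records by context when auditing a journal dump; the regex is inferred only from the line format visible in this log, not from restorecon documentation, and the script itself is illustrative, not part of the captured journal.

import re
import sys
from collections import defaultdict

# Matches records of the form seen above, e.g.
#   restorecon[4684]: /var/lib/kubelet/... not reset as customized by admin
#   to system_u:object_r:container_file_t:s0:c7,c13
# Pattern is an assumption derived from this journal's lines.
RECORD = re.compile(
    r"restorecon\[\d+\]: (?P<path>/\S+) not reset as customized by admin to "
    r"(?P<context>\S+)"
)

def group_by_context(journal_text: str) -> dict:
    """Group 'not reset' paths by their target SELinux context.

    The MCS pair at the end of each context (e.g. s0:c7,c13) is the
    per-pod isolation label, so each bucket roughly corresponds to one pod.
    """
    groups = defaultdict(list)
    for m in RECORD.finditer(journal_text):
        groups[m.group("context")].append(m.group("path"))
    return groups

if __name__ == "__main__":
    for ctx, paths in sorted(group_by_context(sys.stdin.read()).items()):
        print(f"{ctx}: {len(paths)} path(s)")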
Mar 21 03:46:07 crc kubenswrapper[4685]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 21 03:46:07 crc kubenswrapper[4685]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 21 03:46:07 crc kubenswrapper[4685]: I0321 03:46:07.993564 4685 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.000897 4685 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.000925 4685 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.000937 4685 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.000948 4685 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.000957 4685 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.000967 4685 feature_gate.go:330] unrecognized feature gate: Example Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.000976 4685 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.000986 4685 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
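Each deprecation warning above points at the same remedy: move the flag into the KubeletConfiguration file passed via --config. A hypothetical sketch of that migration follows; the field names (containerRuntimeEndpoint, volumePluginDir, registerWithTaints, systemReserved) come from the upstream kubelet.config.k8s.io/v1beta1 API, but the concrete values are placeholders, not this node's actual settings.

import json

# Hypothetical KubeletConfiguration covering the deprecated flags logged
# above. Values are illustrative placeholders, not the CRC machine config.
kubelet_config = {
    "apiVersion": "kubelet.config.k8s.io/v1beta1",
    "kind": "KubeletConfiguration",
    # replaces --container-runtime-endpoint
    "containerRuntimeEndpoint": "unix:///var/run/crio/crio.sock",
    # replaces --volume-plugin-dir
    "volumePluginDir": "/etc/kubernetes/kubelet-plugins/volume/exec",
    # replaces --register-with-taints; taints are structured objects here,
    # not "key=value:Effect" strings as on the command line
    "registerWithTaints": [
        {"key": "node-role.kubernetes.io/master", "effect": "NoSchedule"}
    ],
    # replaces --system-reserved
    "systemReserved": {"cpu": "500m", "memory": "1Gi"},
}

# KubeletConfiguration files are YAML, and YAML 1.2 is a superset of JSON,
# so a pretty-printed JSON dump is already a valid --config file.
print(json.dumps(kubelet_config, indent=2))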
Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.000995 4685 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001004 4685 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001011 4685 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001019 4685 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001027 4685 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001035 4685 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001050 4685 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001057 4685 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001065 4685 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001073 4685 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001080 4685 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001088 4685 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001096 4685 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001104 4685 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001111 4685 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001119 4685 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001126 4685 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001134 4685 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001142 4685 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001149 4685 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001157 4685 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001165 4685 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001172 4685 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001183 4685 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001194 4685 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001203 4685 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001212 4685 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001232 4685 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001243 4685 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001251 4685 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001259 4685 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001267 4685 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001275 4685 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001283 4685 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001292 4685 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001300 4685 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001308 4685 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001318 4685 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001326 4685 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001334 4685 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001342 4685 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001350 4685 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001359 4685 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001367 4685 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001376 4685 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001384 4685 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001392 4685 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001404 4685 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
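This long run of "unrecognized feature gate" warnings (repeated in later passes below) is expected noise on OpenShift: the node is handed the cluster-wide OpenShift feature-gate set, and any gate the vendored Kubernetes code does not know about (GatewayAPI, PinnedImages, NewOLM, and so on) is warned about and otherwise ignored, while known gates such as KMSv1 or CloudDualStackNodeIPs are applied and show up in the resolved "feature gates: {map[...]}" records further down. A sketch, under the same stdin assumption as above, that collapses the noise into a sorted list of unique unknown gate names:

    import re
    import sys

    UNRECOGNIZED = re.compile(r"unrecognized feature gate: (\w+)")

    def unknown_gates(journal_text):
        """Unique gate names the kubelet warned about, sorted."""
        return sorted(set(UNRECOGNIZED.findall(journal_text)))

    if __name__ == "__main__":
        gates = unknown_gates(sys.stdin.read())
        print(len(gates), "unrecognized gates:")
        for gate in gates:
            print(" ", gate)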
Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001414 4685 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001425 4685 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001435 4685 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001443 4685 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001451 4685 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001459 4685 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001466 4685 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001474 4685 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001481 4685 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001489 4685 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001496 4685 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001504 4685 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001512 4685 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001520 4685 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.001527 4685 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002369 4685 flags.go:64] FLAG: --address="0.0.0.0" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002396 4685 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002410 4685 flags.go:64] FLAG: --anonymous-auth="true" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002422 4685 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002440 4685 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002450 4685 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002462 4685 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002473 4685 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002482 4685 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002491 4685 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002502 4685 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002511 4685 flags.go:64] FLAG: 
--cert-dir="/var/lib/kubelet/pki" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002520 4685 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002530 4685 flags.go:64] FLAG: --cgroup-root="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002538 4685 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002548 4685 flags.go:64] FLAG: --client-ca-file="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002557 4685 flags.go:64] FLAG: --cloud-config="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002565 4685 flags.go:64] FLAG: --cloud-provider="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002575 4685 flags.go:64] FLAG: --cluster-dns="[]" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002587 4685 flags.go:64] FLAG: --cluster-domain="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002596 4685 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002606 4685 flags.go:64] FLAG: --config-dir="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002614 4685 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002625 4685 flags.go:64] FLAG: --container-log-max-files="5" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002636 4685 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002645 4685 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002654 4685 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002663 4685 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002672 4685 flags.go:64] FLAG: --contention-profiling="false" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002681 4685 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002690 4685 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002699 4685 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002707 4685 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002719 4685 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002728 4685 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002737 4685 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002746 4685 flags.go:64] FLAG: --enable-load-reader="false" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002756 4685 flags.go:64] FLAG: --enable-server="true" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002765 4685 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002786 4685 flags.go:64] FLAG: --event-burst="100" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002795 4685 flags.go:64] FLAG: --event-qps="50" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002804 4685 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 21 03:46:08 
crc kubenswrapper[4685]: I0321 03:46:08.002813 4685 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002821 4685 flags.go:64] FLAG: --eviction-hard="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002832 4685 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002868 4685 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002877 4685 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002887 4685 flags.go:64] FLAG: --eviction-soft="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002895 4685 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002904 4685 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002914 4685 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002923 4685 flags.go:64] FLAG: --experimental-mounter-path="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002931 4685 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002940 4685 flags.go:64] FLAG: --fail-swap-on="true" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002949 4685 flags.go:64] FLAG: --feature-gates="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002961 4685 flags.go:64] FLAG: --file-check-frequency="20s" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002970 4685 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002980 4685 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002989 4685 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.002997 4685 flags.go:64] FLAG: --healthz-port="10248" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003007 4685 flags.go:64] FLAG: --help="false" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003016 4685 flags.go:64] FLAG: --hostname-override="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003024 4685 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003033 4685 flags.go:64] FLAG: --http-check-frequency="20s" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003042 4685 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003051 4685 flags.go:64] FLAG: --image-credential-provider-config="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003059 4685 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003068 4685 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003077 4685 flags.go:64] FLAG: --image-service-endpoint="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003086 4685 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003095 4685 flags.go:64] FLAG: --kube-api-burst="100" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003104 4685 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 21 
03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003113 4685 flags.go:64] FLAG: --kube-api-qps="50" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003122 4685 flags.go:64] FLAG: --kube-reserved="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003131 4685 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003140 4685 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003149 4685 flags.go:64] FLAG: --kubelet-cgroups="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003158 4685 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003166 4685 flags.go:64] FLAG: --lock-file="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003175 4685 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003183 4685 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003193 4685 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003207 4685 flags.go:64] FLAG: --log-json-split-stream="false" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003216 4685 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003224 4685 flags.go:64] FLAG: --log-text-split-stream="false" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003233 4685 flags.go:64] FLAG: --logging-format="text" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003242 4685 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003251 4685 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003260 4685 flags.go:64] FLAG: --manifest-url="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003269 4685 flags.go:64] FLAG: --manifest-url-header="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003280 4685 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003289 4685 flags.go:64] FLAG: --max-open-files="1000000" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003300 4685 flags.go:64] FLAG: --max-pods="110" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003309 4685 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003318 4685 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003326 4685 flags.go:64] FLAG: --memory-manager-policy="None" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003334 4685 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003345 4685 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003354 4685 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003363 4685 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003383 4685 flags.go:64] FLAG: --node-status-max-images="50" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003392 4685 
flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003401 4685 flags.go:64] FLAG: --oom-score-adj="-999" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003410 4685 flags.go:64] FLAG: --pod-cidr="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003418 4685 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003432 4685 flags.go:64] FLAG: --pod-manifest-path="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003441 4685 flags.go:64] FLAG: --pod-max-pids="-1" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003450 4685 flags.go:64] FLAG: --pods-per-core="0" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003459 4685 flags.go:64] FLAG: --port="10250" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003469 4685 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003478 4685 flags.go:64] FLAG: --provider-id="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003486 4685 flags.go:64] FLAG: --qos-reserved="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003495 4685 flags.go:64] FLAG: --read-only-port="10255" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003504 4685 flags.go:64] FLAG: --register-node="true" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003512 4685 flags.go:64] FLAG: --register-schedulable="true" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003526 4685 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003541 4685 flags.go:64] FLAG: --registry-burst="10" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003551 4685 flags.go:64] FLAG: --registry-qps="5" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003559 4685 flags.go:64] FLAG: --reserved-cpus="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003568 4685 flags.go:64] FLAG: --reserved-memory="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003578 4685 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003587 4685 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003598 4685 flags.go:64] FLAG: --rotate-certificates="false" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003607 4685 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003616 4685 flags.go:64] FLAG: --runonce="false" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003625 4685 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003633 4685 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003643 4685 flags.go:64] FLAG: --seccomp-default="false" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003652 4685 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003660 4685 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003670 4685 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 
03:46:08.003679 4685 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003688 4685 flags.go:64] FLAG: --storage-driver-password="root" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003697 4685 flags.go:64] FLAG: --storage-driver-secure="false" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003706 4685 flags.go:64] FLAG: --storage-driver-table="stats" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003714 4685 flags.go:64] FLAG: --storage-driver-user="root" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003723 4685 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003732 4685 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003741 4685 flags.go:64] FLAG: --system-cgroups="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003750 4685 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003763 4685 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003772 4685 flags.go:64] FLAG: --tls-cert-file="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003780 4685 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003791 4685 flags.go:64] FLAG: --tls-min-version="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003800 4685 flags.go:64] FLAG: --tls-private-key-file="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003809 4685 flags.go:64] FLAG: --topology-manager-policy="none" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003818 4685 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003857 4685 flags.go:64] FLAG: --topology-manager-scope="container" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003867 4685 flags.go:64] FLAG: --v="2" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003879 4685 flags.go:64] FLAG: --version="false" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003890 4685 flags.go:64] FLAG: --vmodule="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003900 4685 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.003910 4685 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004127 4685 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004138 4685 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004147 4685 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004156 4685 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004165 4685 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004173 4685 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004181 4685 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 
03:46:08.004189 4685 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004196 4685 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004204 4685 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004212 4685 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004220 4685 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004228 4685 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004238 4685 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004248 4685 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004258 4685 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004266 4685 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004274 4685 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004283 4685 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004291 4685 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004298 4685 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004306 4685 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004314 4685 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004322 4685 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004330 4685 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004337 4685 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004351 4685 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004360 4685 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004369 4685 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004377 4685 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004386 4685 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
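The records tagged "flags.go:64] FLAG:" above are the kubelet echoing its complete effective flag set at verbosity --v=2, one record per flag, defaults included (for example --containerd, which is inert on this CRI-O node). A sketch, same assumptions as the earlier snippets, that folds those records into a dict, which is handy for diffing the flag sets of two boots:

    import re
    import sys

    # Matches records like:  flags.go:64] FLAG: --max-pods="110"
    FLAG_RE = re.compile(r'FLAG: (--[\w-]+)="([^"]*)"')

    def flag_map(journal_text):
        """Map each logged kubelet flag to its string-rendered value."""
        return dict(FLAG_RE.findall(journal_text))

    if __name__ == "__main__":
        flags = flag_map(sys.stdin.read())
        print("--max-pods =", flags.get("--max-pods"))   # "110" in the boot above
        print("--node-ip  =", flags.get("--node-ip"))    # "192.168.126.11"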
Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004394 4685 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004403 4685 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004410 4685 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004418 4685 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004426 4685 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004433 4685 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004441 4685 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004449 4685 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004457 4685 feature_gate.go:330] unrecognized feature gate: Example Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004465 4685 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004472 4685 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004480 4685 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004490 4685 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004500 4685 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004509 4685 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004517 4685 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.004526 4685 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.006074 4685 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.006175 4685 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.006190 4685 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.006202 4685 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.006214 4685 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.006226 4685 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.006237 4685 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.006249 4685 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.006261 4685 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.006272 4685 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.006284 4685 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.006300 4685 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.006330 4685 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.006341 4685 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.006350 4685 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.006361 4685 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.007167 4685 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.007738 4685 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.007767 4685 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.007776 4685 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.007783 4685 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.007795 4685 feature_gate.go:353] Setting GA feature gate 
CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.007807 4685 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.007823 4685 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.022067 4685 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.022117 4685 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022283 4685 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022303 4685 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022315 4685 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022328 4685 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022340 4685 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022353 4685 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022366 4685 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022379 4685 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022423 4685 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022436 4685 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022448 4685 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022460 4685 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022472 4685 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022483 4685 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022496 4685 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022511 4685 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
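The "feature gates: {map[...]}" record above is the resolved outcome after all overrides: a Go map dump of gate name to boolean, covering only the gates this kubelet actually recognizes. A sketch of turning one such dump into a Python dict; the sample record is abridged from the full line above:

    import re

    def parse_gate_map(record):
        """Parse a Go-style 'map[Name:true Name2:false ...]' dump into a dict of bools."""
        body = re.search(r"map\[([^\]]*)\]", record)
        if body is None:
            raise ValueError("no map[...] literal in record")
        gates = {}
        for pair in body.group(1).split():
            name, _, value = pair.partition(":")
            gates[name] = value == "true"
        return gates

    # Abridged from the feature_gate.go:386 record above.
    record = ("feature gates: {map[CloudDualStackNodeIPs:true "
              "DisableKubeletCloudCredentialProviders:true KMSv1:true "
              "NodeSwap:false ValidatingAdmissionPolicy:true "
              "VolumeAttributesClass:false]}")
    assert parse_gate_map(record)["KMSv1"] is True
    assert parse_gate_map(record)["VolumeAttributesClass"] is False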
Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022524 4685 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022535 4685 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022547 4685 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022557 4685 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022567 4685 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022578 4685 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022588 4685 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022600 4685 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022610 4685 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022621 4685 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022631 4685 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022645 4685 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022657 4685 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022669 4685 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022680 4685 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022692 4685 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022704 4685 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022717 4685 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022728 4685 feature_gate.go:330] unrecognized feature gate: Example Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022737 4685 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022748 4685 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022758 4685 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022769 4685 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022780 4685 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022792 4685 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022802 4685 
feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022813 4685 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022823 4685 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022832 4685 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022904 4685 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022915 4685 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022925 4685 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022935 4685 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022946 4685 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022955 4685 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022965 4685 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022975 4685 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022986 4685 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.022996 4685 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023006 4685 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023016 4685 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023026 4685 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023037 4685 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023046 4685 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023056 4685 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023066 4685 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023080 4685 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023092 4685 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023103 4685 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023114 4685 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023125 4685 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023135 4685 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023145 4685 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023155 4685 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023167 4685 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.023185 4685 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023574 4685 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023599 4685 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023612 4685 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023624 4685 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023635 4685 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023646 4685 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023658 4685 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023669 4685 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023681 4685 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023692 4685 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023703 4685 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023717 4685 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
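The resolved map is logged once per parse pass (here at 03:46:08.007823, .023185, and again at .024426 below), and in this boot all three agree. A small optional consistency check over a journal dump, same stdin assumption as above, that would flag a gate set changing between passes:

    import re
    import sys

    MAP_RE = re.compile(r"feature gates: \{map\[[^\]]*\]\}")

    records = MAP_RE.findall(sys.stdin.read())
    # All resolved-map records within one boot should be byte-identical.
    assert len(set(records)) <= 1, "feature-gate map changed between parse passes"
    print(len(records), "identical resolved gate maps")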
Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023732 4685 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023744 4685 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023755 4685 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023767 4685 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023777 4685 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023788 4685 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023797 4685 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023807 4685 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023817 4685 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023829 4685 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023877 4685 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023889 4685 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023902 4685 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023915 4685 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023926 4685 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023937 4685 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023947 4685 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023957 4685 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023967 4685 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023977 4685 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023988 4685 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.023998 4685 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024007 4685 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024020 4685 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024031 4685 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024041 4685 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024051 4685 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024061 4685 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024072 4685 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024082 4685 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024097 4685 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024110 4685 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024121 4685 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024132 4685 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024144 4685 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024154 4685 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024166 4685 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024178 4685 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024188 4685 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024199 4685 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024210 4685 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024221 4685 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024231 4685 feature_gate.go:330] unrecognized feature gate: Example Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024243 4685 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024253 4685 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024264 4685 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024276 4685 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024288 4685 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024298 4685 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024309 4685 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024320 4685 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024332 4685 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024343 4685 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024353 4685 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024364 4685 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024374 4685 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024384 4685 feature_gate.go:330] 
unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024395 4685 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.024405 4685 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.024426 4685 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.025890 4685 server.go:940] "Client rotation is on, will bootstrap in background" Mar 21 03:46:08 crc kubenswrapper[4685]: E0321 03:46:08.033228 4685 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.038360 4685 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.038536 4685 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.042122 4685 server.go:997] "Starting client certificate rotation" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.042178 4685 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.042376 4685 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.069289 4685 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.073049 4685 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 21 03:46:08 crc kubenswrapper[4685]: E0321 03:46:08.073162 4685 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.096123 4685 log.go:25] "Validated CRI v1 runtime API" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.139689 4685 log.go:25] "Validated CRI v1 image API" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.142704 4685 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.150292 4685 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-21-03-42-29-00:/dev/sr0 7B77-95E7:/dev/vda2 
de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.150364 4685 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:43 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}] Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.181898 4685 manager.go:217] Machine: {Timestamp:2026-03-21 03:46:08.178097775 +0000 UTC m=+0.655166637 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa BootID:8bd8455e-cd7f-4a01-9ab2-39696fd22c82 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:43 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:a0:3b:e0 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:a0:3b:e0 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:46:91:b0 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:c5:d5:c8 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:33:af:97 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:12:8c:f2 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:72:17:53:74:63:c0 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:be:ad:bd:03:0b:2c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 
Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.182402 4685 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
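
A few figures in the Machine line above can be cross-checked directly: the tmpfs mounts (/dev/shm, /tmp) report exactly half of MemoryCapacity, which is the Linux tmpfs default size, and the NumCores:12 / NumPhysicalCores:1 / NumSockets:12 topology is the usual virtualized layout where each vCPU is exposed as its own single-core socket. A quick arithmetic check on the logged values (plain Go, not cAdvisor code; the file name is illustrative):

```go
// machineinfo_check.go: cross-checks two figures from the cAdvisor
// Machine line above using plain arithmetic on the logged values.
package main

import "fmt"

func main() {
	const memoryCapacity = 33654128640 // bytes, MemoryCapacity from the Machine line
	const shmCapacity = 16827064320    // bytes, the /dev/shm filesystem entry

	// tmpfs defaults to half of RAM, so /dev/shm (and /tmp here)
	// report exactly MemoryCapacity/2.
	fmt.Println(shmCapacity == memoryCapacity/2) // true

	// MemoryCapacity in GiB: ~31.3 GiB for this CRC VM.
	fmt.Printf("%.1f GiB\n", float64(memoryCapacity)/(1<<30))
}
```
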
Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.182630 4685 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.183120 4685 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.183430 4685 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.183495 4685 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.183924 4685 topology_manager.go:138] "Creating topology manager with none policy" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.183943 4685 container_manager_linux.go:303] "Creating device plugin manager" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.184549 4685 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.184603 4685 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.185464 4685 state_mem.go:36] "Initialized new in-memory state store" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.186133 4685 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.193198 4685 kubelet.go:418] "Attempting to sync node with API server" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.193267 4685 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" 
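
The NodeConfig line above carries everything needed to derive node allocatable: KubeReserved is null, SystemReserved holds 350Mi of memory, and the memory.available hard-eviction threshold is 100Mi. Kubernetes computes allocatable as capacity minus kube-reserved minus system-reserved minus hard eviction; a sketch with the logged values (file and variable names are illustrative):

```go
// allocatable_sketch.go: the standard kubelet node-allocatable formula
//   allocatable = capacity - kube-reserved - system-reserved - eviction-hard
// applied to the values logged above (KubeReserved is null here, so it
// contributes nothing).
package main

import "fmt"

const Mi = 1 << 20

func main() {
	capacity := int64(33654128640)    // MemoryCapacity from the Machine line
	systemReserved := int64(350 * Mi) // SystemReserved memory from NodeConfig
	evictionHard := int64(100 * Mi)   // memory.available hard threshold

	allocatable := capacity - systemReserved - evictionHard
	fmt.Printf("allocatable memory: %d bytes (%d Ki)\n",
		allocatable, allocatable/1024)
	// => 33182269440 bytes (32404560 Ki)
}
```

Assuming no other reservations, this is the figure the node would advertise as allocatable memory once it registers, e.g. in kubectl describe node output.
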
Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.193318 4685 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.193342 4685 kubelet.go:324] "Adding apiserver pod source" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.193378 4685 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.199101 4685 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.200289 4685 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.201158 4685 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 21 03:46:08 crc kubenswrapper[4685]: E0321 03:46:08.201270 4685 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.201184 4685 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 21 03:46:08 crc kubenswrapper[4685]: E0321 03:46:08.201380 4685 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.204302 4685 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.206067 4685 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.206114 4685 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.206130 4685 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.206145 4685 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.206167 4685 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.206181 4685 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.206194 4685 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.206217 4685 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/downward-api" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.206234 4685 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.206248 4685 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.206275 4685 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.206290 4685 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.207323 4685 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.208124 4685 server.go:1280] "Started kubelet" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.208296 4685 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.209281 4685 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.209788 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.210310 4685 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 21 03:46:08 crc systemd[1]: Started Kubernetes Kubelet. Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.212003 4685 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.212052 4685 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 21 03:46:08 crc kubenswrapper[4685]: E0321 03:46:08.212759 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.213055 4685 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.213097 4685 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.213109 4685 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.222465 4685 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 21 03:46:08 crc kubenswrapper[4685]: E0321 03:46:08.222584 4685 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 21 03:46:08 crc kubenswrapper[4685]: E0321 03:46:08.222951 4685 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.102.83.158:6443: connect: connection refused" interval="200ms" Mar 21 03:46:08 crc kubenswrapper[4685]: E0321 03:46:08.224255 4685 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.158:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189ebe81dd94074f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:08.208078671 +0000 UTC m=+0.685147493,LastTimestamp:2026-03-21 03:46:08.208078671 +0000 UTC m=+0.685147493,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.229376 4685 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.229415 4685 factory.go:55] Registering systemd factory Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.229431 4685 factory.go:221] Registration of the systemd container factory successfully Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.229606 4685 server.go:460] "Adding debug handlers to kubelet server" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.230776 4685 factory.go:153] Registering CRI-O factory Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.230827 4685 factory.go:221] Registration of the crio container factory successfully Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.230906 4685 factory.go:103] Registering Raw factory Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.230938 4685 manager.go:1196] Started watching for new ooms in manager Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.232939 4685 manager.go:319] Starting recovery of all containers Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.238428 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.238517 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.238542 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.238564 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 
03:46:08.238584 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.238604 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.238625 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.238644 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.238666 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.238686 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.238706 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.238724 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.238743 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.238765 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.238782 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.238801 4685 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.238823 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.238871 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.238895 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.238917 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.238936 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.238956 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.238976 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.238995 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239013 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239034 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239060 4685 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239128 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239148 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239169 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239190 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239210 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239233 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239253 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239273 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239293 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239314 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239335 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239355 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239376 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239397 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239421 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239447 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239468 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239488 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239510 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239530 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239552 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239579 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239605 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239627 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239648 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239672 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239694 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239717 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239737 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239759 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239781 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239800 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239818 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239864 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239885 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239907 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239927 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239949 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239971 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.239992 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240010 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240033 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240053 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240075 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240095 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240114 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240135 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240154 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240172 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240216 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240235 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240256 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240276 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240297 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240316 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240337 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240356 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240374 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240392 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240414 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240435 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240465 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240485 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240503 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240521 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240540 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240558 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240576 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240595 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240625 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240646 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240667 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240686 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240707 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240725 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240744 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240764 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240949 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240978 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.240999 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241019 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241041 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241060 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241080 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241105 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241126 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241151 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241172 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241191 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241212 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241232 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241251 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241296 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241316 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241334 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241353 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241372 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241392 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241411 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241432 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241450 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241470 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241489 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241507 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241525 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241543 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241562 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241579 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241599 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241618 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241638 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241656 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241678 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241699 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241718 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241736 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241754 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241778 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241802 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241819 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241938 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241958 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241977 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.241994 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242012 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242030 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242050 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242069 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242087 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242105 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242123 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242142 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242160 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242181 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242201 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242218 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242239 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242257 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242276 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242295 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242315 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242335 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242354 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242372 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242391 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242409 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242427 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242445 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242464 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242482 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242506 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242523 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242541 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242560 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242578 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242596 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242616 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242635 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242654 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242674 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242692 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242710 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242729 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242749 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242767 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242787 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242807 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242827 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.242869 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.248791 4685 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.248875 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.248910 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.248936 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.248967 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.248988 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.249009 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.249032 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.249052 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.249074 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.249094 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.249117 4685 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.249138 4685 reconstruct.go:97] "Volume reconstruction finished" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.249152 4685 reconciler.go:26] "Reconciler: start to sync state" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.270330 4685 manager.go:324] Recovery completed Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.289783 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.291568 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.291634 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.291648 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.293358 4685 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.293379 4685 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.293403 4685 state_mem.go:36] "Initialized new in-memory state store" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.294409 4685 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.299594 4685 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.299638 4685 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.299678 4685 kubelet.go:2335] "Starting kubelet main sync loop" Mar 21 03:46:08 crc kubenswrapper[4685]: E0321 03:46:08.299743 4685 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.301943 4685 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 21 03:46:08 crc kubenswrapper[4685]: E0321 03:46:08.301999 4685 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.311281 4685 policy_none.go:49] "None policy: Start" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.312049 4685 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.312076 4685 state_mem.go:35] "Initializing new in-memory state store" Mar 21 03:46:08 crc kubenswrapper[4685]: E0321 03:46:08.313896 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.372124 4685 manager.go:334] "Starting Device Plugin manager" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.372209 4685 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.372227 4685 server.go:79] "Starting device plugin registration server" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.372792 4685 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.372814 4685 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.372981 4685 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.373165 4685 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.373190 4685 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 21 03:46:08 crc kubenswrapper[4685]: E0321 03:46:08.386753 4685 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.400183 4685 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 21 03:46:08 crc kubenswrapper[4685]: 
I0321 03:46:08.400599 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.402151 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.402206 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.402221 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.402450 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.403608 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.403644 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.403690 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.404697 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.404705 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.404776 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.404884 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.404936 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.405879 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.405910 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.405924 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.406061 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.406115 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.406133 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.407201 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.407238 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.407250 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.407387 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.407518 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.407577 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.408727 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.408785 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.408814 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.410537 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.410595 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.410610 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.410870 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.411224 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.411284 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.412675 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.412728 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.412745 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.412707 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.412848 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.412862 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.413052 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.413075 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.413917 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.413951 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.413966 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:08 crc kubenswrapper[4685]: E0321 03:46:08.423581 4685 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="400ms" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.452039 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.452110 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.452159 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.452218 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.452270 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.452321 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.452798 4685 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.452899 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.452948 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.452986 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.453028 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.453064 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.453148 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.453197 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.453226 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.473285 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.475218 
4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.475351 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.475432 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.475538 4685 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 03:46:08 crc kubenswrapper[4685]: E0321 03:46:08.476111 4685 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.158:6443: connect: connection refused" node="crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.554620 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.554667 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.554684 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.554701 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.554720 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.554735 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.554749 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.554764 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.554781 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.554797 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.554812 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.554828 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.554863 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.554881 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.554873 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.554935 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.554898 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.554975 
4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.554999 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.554988 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.555026 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.555056 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.555084 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.555081 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.555119 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.555169 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.555176 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.555284 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.555299 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.555273 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.676474 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.678051 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.678115 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.678128 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.678165 4685 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 03:46:08 crc kubenswrapper[4685]: E0321 03:46:08.678739 4685 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.158:6443: connect: connection refused" node="crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.729122 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.753947 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.762597 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.786827 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: I0321 03:46:08.791476 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.797931 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-7500bc92531e2ee58e726481107304735aaa4ad2523a748ae1c19e9c5f4fa5a5 WatchSource:0}: Error finding container 7500bc92531e2ee58e726481107304735aaa4ad2523a748ae1c19e9c5f4fa5a5: Status 404 returned error can't find the container with id 7500bc92531e2ee58e726481107304735aaa4ad2523a748ae1c19e9c5f4fa5a5 Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.816219 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-1cacdd6a782411bdcd12014c7c70934aeb1fd5eba5f849d0e62597964b54e18e WatchSource:0}: Error finding container 1cacdd6a782411bdcd12014c7c70934aeb1fd5eba5f849d0e62597964b54e18e: Status 404 returned error can't find the container with id 1cacdd6a782411bdcd12014c7c70934aeb1fd5eba5f849d0e62597964b54e18e Mar 21 03:46:08 crc kubenswrapper[4685]: W0321 03:46:08.820151 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-969cf93a24e192d69148b4227bffadc547f63358f32c8d8abe9450232f67fa5c WatchSource:0}: Error finding container 969cf93a24e192d69148b4227bffadc547f63358f32c8d8abe9450232f67fa5c: Status 404 returned error can't find the container with id 969cf93a24e192d69148b4227bffadc547f63358f32c8d8abe9450232f67fa5c Mar 21 03:46:08 crc kubenswrapper[4685]: E0321 03:46:08.824820 4685 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="800ms" Mar 21 03:46:08 crc kubenswrapper[4685]: E0321 03:46:08.824917 4685 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.158:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189ebe81dd94074f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:08.208078671 +0000 UTC m=+0.685147493,LastTimestamp:2026-03-21 03:46:08.208078671 +0000 UTC m=+0.685147493,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:09 crc kubenswrapper[4685]: I0321 03:46:09.079799 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:09 crc kubenswrapper[4685]: I0321 03:46:09.081487 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:09 crc kubenswrapper[4685]: I0321 03:46:09.081582 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:09 crc kubenswrapper[4685]: I0321 03:46:09.081597 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 21 03:46:09 crc kubenswrapper[4685]: I0321 03:46:09.081634 4685 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 03:46:09 crc kubenswrapper[4685]: E0321 03:46:09.082329 4685 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.158:6443: connect: connection refused" node="crc" Mar 21 03:46:09 crc kubenswrapper[4685]: W0321 03:46:09.152687 4685 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 21 03:46:09 crc kubenswrapper[4685]: E0321 03:46:09.152804 4685 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 21 03:46:09 crc kubenswrapper[4685]: I0321 03:46:09.210897 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 21 03:46:09 crc kubenswrapper[4685]: I0321 03:46:09.305352 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7500bc92531e2ee58e726481107304735aaa4ad2523a748ae1c19e9c5f4fa5a5"} Mar 21 03:46:09 crc kubenswrapper[4685]: I0321 03:46:09.306673 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"969cf93a24e192d69148b4227bffadc547f63358f32c8d8abe9450232f67fa5c"} Mar 21 03:46:09 crc kubenswrapper[4685]: I0321 03:46:09.308320 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1cacdd6a782411bdcd12014c7c70934aeb1fd5eba5f849d0e62597964b54e18e"} Mar 21 03:46:09 crc kubenswrapper[4685]: I0321 03:46:09.309740 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f3887e0794e0e834fa33f5266c66c7483e1c1ab35950d60e2515de0b7739772d"} Mar 21 03:46:09 crc kubenswrapper[4685]: I0321 03:46:09.313469 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9cbf96f4bbfbd6a33b9912f528c5697c774bb5b5776561dfe307eba2dad75992"} Mar 21 03:46:09 crc kubenswrapper[4685]: E0321 03:46:09.626506 4685 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="1.6s" Mar 21 03:46:09 crc kubenswrapper[4685]: W0321 03:46:09.664296 4685 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 21 03:46:09 crc kubenswrapper[4685]: E0321 03:46:09.664396 4685 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 21 03:46:09 crc kubenswrapper[4685]: W0321 03:46:09.665179 4685 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 21 03:46:09 crc kubenswrapper[4685]: E0321 03:46:09.665351 4685 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 21 03:46:09 crc kubenswrapper[4685]: W0321 03:46:09.675773 4685 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 21 03:46:09 crc kubenswrapper[4685]: E0321 03:46:09.675826 4685 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 21 03:46:09 crc kubenswrapper[4685]: I0321 03:46:09.882813 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:09 crc kubenswrapper[4685]: I0321 03:46:09.884435 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:09 crc kubenswrapper[4685]: I0321 03:46:09.884482 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:09 crc kubenswrapper[4685]: I0321 03:46:09.884501 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:09 crc kubenswrapper[4685]: I0321 03:46:09.884540 4685 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 03:46:09 crc kubenswrapper[4685]: E0321 03:46:09.885204 4685 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.158:6443: connect: connection refused" node="crc" Mar 21 03:46:10 crc kubenswrapper[4685]: I0321 03:46:10.130660 4685 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 21 03:46:10 crc kubenswrapper[4685]: E0321 03:46:10.132381 4685 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate 
signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 21 03:46:10 crc kubenswrapper[4685]: I0321 03:46:10.211169 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 21 03:46:10 crc kubenswrapper[4685]: I0321 03:46:10.318873 4685 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ec4c4650f99d4949377056894865bf42c7f70a0307ca2c52da91f55d4730b77b" exitCode=0 Mar 21 03:46:10 crc kubenswrapper[4685]: I0321 03:46:10.319049 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ec4c4650f99d4949377056894865bf42c7f70a0307ca2c52da91f55d4730b77b"} Mar 21 03:46:10 crc kubenswrapper[4685]: I0321 03:46:10.319079 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:10 crc kubenswrapper[4685]: I0321 03:46:10.320316 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:10 crc kubenswrapper[4685]: I0321 03:46:10.320398 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:10 crc kubenswrapper[4685]: I0321 03:46:10.320422 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:10 crc kubenswrapper[4685]: I0321 03:46:10.321764 4685 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="5af33f2ffea677fec9bcbaa1de7545651b1021c12e6ea778649d9b6b160b6b8e" exitCode=0 Mar 21 03:46:10 crc kubenswrapper[4685]: I0321 03:46:10.321858 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"5af33f2ffea677fec9bcbaa1de7545651b1021c12e6ea778649d9b6b160b6b8e"} Mar 21 03:46:10 crc kubenswrapper[4685]: I0321 03:46:10.321901 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:10 crc kubenswrapper[4685]: I0321 03:46:10.323788 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:10 crc kubenswrapper[4685]: I0321 03:46:10.323830 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:10 crc kubenswrapper[4685]: I0321 03:46:10.323966 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:10 crc kubenswrapper[4685]: I0321 03:46:10.324441 4685 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="33920b49011815c1b030d555584aee220e8fede3b4f9ff5f8a5f554d6e1d8d02" exitCode=0 Mar 21 03:46:10 crc kubenswrapper[4685]: I0321 03:46:10.324519 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"33920b49011815c1b030d555584aee220e8fede3b4f9ff5f8a5f554d6e1d8d02"} Mar 21 03:46:10 crc kubenswrapper[4685]: I0321 03:46:10.324570 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:10 crc kubenswrapper[4685]: I0321 03:46:10.326684 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:10 crc kubenswrapper[4685]: I0321 03:46:10.326754 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:10 crc kubenswrapper[4685]: I0321 03:46:10.326775 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:10 crc kubenswrapper[4685]: I0321 03:46:10.334399 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"41664fe1fb74ab3c38846eaf0a2b0bf46f5b0c4a2d2b3dfadb1a5e02e5a66e81"} Mar 21 03:46:10 crc kubenswrapper[4685]: I0321 03:46:10.334487 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"229992b297c1eb9aa6f92e56505bdf819e392fb777636759549854330bf022ab"} Mar 21 03:46:10 crc kubenswrapper[4685]: I0321 03:46:10.334507 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"aa74801f196e57343a98011d671414abf1f1ebf4d7b962be522f0b6cad777acf"} Mar 21 03:46:10 crc kubenswrapper[4685]: I0321 03:46:10.336969 4685 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d" exitCode=0 Mar 21 03:46:10 crc kubenswrapper[4685]: I0321 03:46:10.337008 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d"} Mar 21 03:46:10 crc kubenswrapper[4685]: I0321 03:46:10.337139 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:10 crc kubenswrapper[4685]: I0321 03:46:10.338359 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:10 crc kubenswrapper[4685]: I0321 03:46:10.338423 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:10 crc kubenswrapper[4685]: I0321 03:46:10.338445 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:10 crc kubenswrapper[4685]: I0321 03:46:10.345204 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:10 crc kubenswrapper[4685]: I0321 03:46:10.346621 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:10 crc kubenswrapper[4685]: I0321 03:46:10.346652 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 21 03:46:10 crc kubenswrapper[4685]: I0321 03:46:10.346662 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:10 crc kubenswrapper[4685]: W0321 03:46:10.918350 4685 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 21 03:46:10 crc kubenswrapper[4685]: E0321 03:46:10.918489 4685 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 21 03:46:11 crc kubenswrapper[4685]: I0321 03:46:11.211599 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 21 03:46:11 crc kubenswrapper[4685]: E0321 03:46:11.229186 4685 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="3.2s" Mar 21 03:46:11 crc kubenswrapper[4685]: W0321 03:46:11.318200 4685 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 21 03:46:11 crc kubenswrapper[4685]: E0321 03:46:11.318334 4685 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 21 03:46:11 crc kubenswrapper[4685]: I0321 03:46:11.342924 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6f2aa64984fe04680e1c00621e521e1a2a6eb5c2c2696c88fee33cc6eaa528f6"} Mar 21 03:46:11 crc kubenswrapper[4685]: I0321 03:46:11.343012 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:11 crc kubenswrapper[4685]: I0321 03:46:11.344375 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:11 crc kubenswrapper[4685]: I0321 03:46:11.344417 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:11 crc kubenswrapper[4685]: I0321 03:46:11.344434 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:11 crc kubenswrapper[4685]: I0321 03:46:11.346570 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"182ad351b6c632ea64087d4784ea919d5c21165dc5d0373fa35db7f7f1eea435"} Mar 21 03:46:11 crc kubenswrapper[4685]: I0321 03:46:11.346600 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"60c96edd458d05f217a2e9f07a44bd221303d821a790382a82cff0b912d48f63"} Mar 21 03:46:11 crc kubenswrapper[4685]: I0321 03:46:11.346630 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a8583e9b5dff82d5df52e281ba4069e9259b1c8fe3d1b8121d0e9f3f9e97d47b"} Mar 21 03:46:11 crc kubenswrapper[4685]: I0321 03:46:11.346641 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ba587c4fe2f05966282b50ba5236b9f3d9ef6de63f72c70ae9f7a5222cb8b904"} Mar 21 03:46:11 crc kubenswrapper[4685]: I0321 03:46:11.349226 4685 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="9fdec8282c7f8bf7689534ffdf21d26aadc5e121210d117c2cca637ae2714dc4" exitCode=0 Mar 21 03:46:11 crc kubenswrapper[4685]: I0321 03:46:11.349330 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"9fdec8282c7f8bf7689534ffdf21d26aadc5e121210d117c2cca637ae2714dc4"} Mar 21 03:46:11 crc kubenswrapper[4685]: I0321 03:46:11.349441 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:11 crc kubenswrapper[4685]: I0321 03:46:11.350640 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:11 crc kubenswrapper[4685]: I0321 03:46:11.350682 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:11 crc kubenswrapper[4685]: I0321 03:46:11.350699 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:11 crc kubenswrapper[4685]: I0321 03:46:11.353332 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"7a85a4c157d29725897cd0fd271fa9844551cbb95dbc21e085684115987d8e3d"} Mar 21 03:46:11 crc kubenswrapper[4685]: I0321 03:46:11.353351 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:11 crc kubenswrapper[4685]: I0321 03:46:11.354455 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:11 crc kubenswrapper[4685]: I0321 03:46:11.354479 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:11 crc kubenswrapper[4685]: I0321 03:46:11.354488 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:11 crc kubenswrapper[4685]: I0321 03:46:11.356483 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"405513730b33c9e9981a2c99d2e2c5897042c1a395d9fc099ad6818a6352cb2b"} Mar 21 03:46:11 crc kubenswrapper[4685]: I0321 03:46:11.356545 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2fe44d24bf93a4fdf116cd039951b919f771648e1a8c53df1507569b821dc8aa"} Mar 21 03:46:11 crc kubenswrapper[4685]: I0321 03:46:11.356563 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0b6e897d496eb8cc32a4f3c51a65335a9594f50c7010a0f022f54722edfd38e1"} Mar 21 03:46:11 crc kubenswrapper[4685]: I0321 03:46:11.356682 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:11 crc kubenswrapper[4685]: I0321 03:46:11.357790 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:11 crc kubenswrapper[4685]: I0321 03:46:11.357825 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:11 crc kubenswrapper[4685]: I0321 03:46:11.357863 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:11 crc kubenswrapper[4685]: I0321 03:46:11.485711 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:11 crc kubenswrapper[4685]: I0321 03:46:11.487149 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:11 crc kubenswrapper[4685]: I0321 03:46:11.487233 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:11 crc kubenswrapper[4685]: I0321 03:46:11.487245 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:11 crc kubenswrapper[4685]: I0321 03:46:11.487280 4685 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 03:46:11 crc kubenswrapper[4685]: E0321 03:46:11.488823 4685 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.158:6443: connect: connection refused" node="crc" Mar 21 03:46:11 crc kubenswrapper[4685]: W0321 03:46:11.718466 4685 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 21 03:46:11 crc kubenswrapper[4685]: E0321 03:46:11.718578 4685 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 21 03:46:12 crc kubenswrapper[4685]: I0321 03:46:12.364234 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"635a203890cbbc6d9e5af8002033df35908fba9abc4b67d9a79657261f9c7949"} Mar 21 03:46:12 crc kubenswrapper[4685]: I0321 03:46:12.365347 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:12 crc kubenswrapper[4685]: I0321 03:46:12.366542 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:12 crc kubenswrapper[4685]: I0321 03:46:12.366696 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:12 crc kubenswrapper[4685]: I0321 03:46:12.366776 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:12 crc kubenswrapper[4685]: I0321 03:46:12.369398 4685 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="27369c227e083d4d1256d5e76ae4791b7246404b518c96b026f3e6b456d3b5bd" exitCode=0 Mar 21 03:46:12 crc kubenswrapper[4685]: I0321 03:46:12.369577 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:12 crc kubenswrapper[4685]: I0321 03:46:12.370144 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:12 crc kubenswrapper[4685]: I0321 03:46:12.370591 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"27369c227e083d4d1256d5e76ae4791b7246404b518c96b026f3e6b456d3b5bd"} Mar 21 03:46:12 crc kubenswrapper[4685]: I0321 03:46:12.370748 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:12 crc kubenswrapper[4685]: I0321 03:46:12.371247 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:12 crc kubenswrapper[4685]: I0321 03:46:12.371767 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 21 03:46:12 crc kubenswrapper[4685]: I0321 03:46:12.372339 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:12 crc kubenswrapper[4685]: I0321 03:46:12.372434 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:12 crc kubenswrapper[4685]: I0321 03:46:12.372508 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:12 crc kubenswrapper[4685]: I0321 03:46:12.373089 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:12 crc kubenswrapper[4685]: I0321 03:46:12.373266 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:12 crc kubenswrapper[4685]: I0321 03:46:12.373366 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:12 crc kubenswrapper[4685]: I0321 03:46:12.373243 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:12 crc kubenswrapper[4685]: I0321 03:46:12.373568 4685 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:12 crc kubenswrapper[4685]: I0321 03:46:12.373589 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:12 crc kubenswrapper[4685]: I0321 03:46:12.373389 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:12 crc kubenswrapper[4685]: I0321 03:46:12.373645 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:12 crc kubenswrapper[4685]: I0321 03:46:12.373660 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:13 crc kubenswrapper[4685]: I0321 03:46:13.112298 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:46:13 crc kubenswrapper[4685]: I0321 03:46:13.113783 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:46:13 crc kubenswrapper[4685]: I0321 03:46:13.380791 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"fa335fcef30e28379cd8fdec5d44d8ba3f93fe5efe173c61c12f8f5fd26a45d3"} Mar 21 03:46:13 crc kubenswrapper[4685]: I0321 03:46:13.380898 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c62e0118a8723a69e3bc62e2694f4aaede4801a3a418290d7a88f21bad87ec7d"} Mar 21 03:46:13 crc kubenswrapper[4685]: I0321 03:46:13.380918 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9569910899cde040b1c240b1312fc72568bd8c0021cce7f7867b1c48450bdd83"} Mar 21 03:46:13 crc kubenswrapper[4685]: I0321 03:46:13.380960 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:13 crc kubenswrapper[4685]: I0321 03:46:13.381051 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:13 crc kubenswrapper[4685]: I0321 03:46:13.382812 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:13 crc kubenswrapper[4685]: I0321 03:46:13.382901 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:13 crc kubenswrapper[4685]: I0321 03:46:13.382922 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:13 crc kubenswrapper[4685]: I0321 03:46:13.384321 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:13 crc kubenswrapper[4685]: I0321 03:46:13.384386 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:13 crc kubenswrapper[4685]: I0321 03:46:13.384410 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:13 crc kubenswrapper[4685]: I0321 03:46:13.798638 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 03:46:13 crc kubenswrapper[4685]: I0321 03:46:13.798956 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:13 crc kubenswrapper[4685]: I0321 03:46:13.801243 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:13 crc kubenswrapper[4685]: I0321 03:46:13.801296 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:13 crc kubenswrapper[4685]: I0321 03:46:13.801314 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:13 crc kubenswrapper[4685]: I0321 03:46:13.810352 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 03:46:14 crc kubenswrapper[4685]: I0321 03:46:14.092995 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:46:14 crc kubenswrapper[4685]: I0321 03:46:14.141669 4685 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 21 03:46:14 crc kubenswrapper[4685]: I0321 03:46:14.390141 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"06cc4c770a46e9e459a9c474946b3ce02795f4b9eeae9f1bed7d5f9ac65320f0"} Mar 21 03:46:14 crc kubenswrapper[4685]: I0321 03:46:14.390226 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:14 crc kubenswrapper[4685]: I0321 03:46:14.390246 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:14 crc kubenswrapper[4685]: I0321 03:46:14.390337 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:14 crc kubenswrapper[4685]: I0321 03:46:14.390237 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7b977ccd5733667251363a579d3870b937ed30a3723b079f215a3be2ace9d4b9"} Mar 21 03:46:14 crc kubenswrapper[4685]: I0321 03:46:14.392102 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:14 crc kubenswrapper[4685]: I0321 03:46:14.392153 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:14 crc kubenswrapper[4685]: I0321 03:46:14.392173 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:14 crc kubenswrapper[4685]: I0321 03:46:14.392194 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:14 crc kubenswrapper[4685]: I0321 03:46:14.392220 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:14 crc kubenswrapper[4685]: I0321 03:46:14.392199 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:14 crc kubenswrapper[4685]: I0321 03:46:14.392217 4685 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:14 crc kubenswrapper[4685]: I0321 03:46:14.392297 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:14 crc kubenswrapper[4685]: I0321 03:46:14.392320 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:14 crc kubenswrapper[4685]: I0321 03:46:14.688993 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:14 crc kubenswrapper[4685]: I0321 03:46:14.690967 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:14 crc kubenswrapper[4685]: I0321 03:46:14.691034 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:14 crc kubenswrapper[4685]: I0321 03:46:14.691053 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:14 crc kubenswrapper[4685]: I0321 03:46:14.691091 4685 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 03:46:14 crc kubenswrapper[4685]: I0321 03:46:14.867626 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 03:46:15 crc kubenswrapper[4685]: I0321 03:46:15.393708 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:15 crc kubenswrapper[4685]: I0321 03:46:15.393743 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:15 crc kubenswrapper[4685]: I0321 03:46:15.393811 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:15 crc kubenswrapper[4685]: I0321 03:46:15.395541 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:15 crc kubenswrapper[4685]: I0321 03:46:15.395595 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:15 crc kubenswrapper[4685]: I0321 03:46:15.395611 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:15 crc kubenswrapper[4685]: I0321 03:46:15.395568 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:15 crc kubenswrapper[4685]: I0321 03:46:15.395669 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:15 crc kubenswrapper[4685]: I0321 03:46:15.395684 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:15 crc kubenswrapper[4685]: I0321 03:46:15.395616 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:15 crc kubenswrapper[4685]: I0321 03:46:15.395866 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:15 crc kubenswrapper[4685]: I0321 03:46:15.395883 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:16 crc kubenswrapper[4685]: I0321 03:46:16.582571 4685 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 21 03:46:16 crc kubenswrapper[4685]: I0321 03:46:16.582936 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:16 crc kubenswrapper[4685]: I0321 03:46:16.584757 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:16 crc kubenswrapper[4685]: I0321 03:46:16.584817 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:16 crc kubenswrapper[4685]: I0321 03:46:16.584885 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:16 crc kubenswrapper[4685]: I0321 03:46:16.921618 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 21 03:46:17 crc kubenswrapper[4685]: I0321 03:46:17.399028 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:17 crc kubenswrapper[4685]: I0321 03:46:17.400229 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:17 crc kubenswrapper[4685]: I0321 03:46:17.400278 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:17 crc kubenswrapper[4685]: I0321 03:46:17.400290 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:18 crc kubenswrapper[4685]: I0321 03:46:18.078163 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 03:46:18 crc kubenswrapper[4685]: I0321 03:46:18.078369 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:18 crc kubenswrapper[4685]: I0321 03:46:18.079767 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:18 crc kubenswrapper[4685]: I0321 03:46:18.079878 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:18 crc kubenswrapper[4685]: I0321 03:46:18.079905 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:18 crc kubenswrapper[4685]: E0321 03:46:18.387003 4685 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 03:46:19 crc kubenswrapper[4685]: I0321 03:46:19.465633 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 03:46:19 crc kubenswrapper[4685]: I0321 03:46:19.466312 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:19 crc kubenswrapper[4685]: I0321 03:46:19.468575 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:19 crc kubenswrapper[4685]: I0321 03:46:19.468731 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:19 crc kubenswrapper[4685]: I0321 03:46:19.468754 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Mar 21 03:46:19 crc kubenswrapper[4685]: I0321 03:46:19.472902 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 03:46:20 crc kubenswrapper[4685]: I0321 03:46:20.407582 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:20 crc kubenswrapper[4685]: I0321 03:46:20.410027 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:20 crc kubenswrapper[4685]: I0321 03:46:20.410091 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:20 crc kubenswrapper[4685]: I0321 03:46:20.410111 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:22 crc kubenswrapper[4685]: I0321 03:46:22.212571 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 21 03:46:22 crc kubenswrapper[4685]: W0321 03:46:22.407471 4685 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 21 03:46:22 crc kubenswrapper[4685]: I0321 03:46:22.407639 4685 trace.go:236] Trace[309125798]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (21-Mar-2026 03:46:12.406) (total time: 10001ms): Mar 21 03:46:22 crc kubenswrapper[4685]: Trace[309125798]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (03:46:22.407) Mar 21 03:46:22 crc kubenswrapper[4685]: Trace[309125798]: [10.001237518s] [10.001237518s] END Mar 21 03:46:22 crc kubenswrapper[4685]: E0321 03:46:22.407688 4685 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 21 03:46:22 crc kubenswrapper[4685]: I0321 03:46:22.466379 4685 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 21 03:46:22 crc kubenswrapper[4685]: I0321 03:46:22.466540 4685 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 21 03:46:23 crc kubenswrapper[4685]: E0321 03:46:23.081570 4685 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing 
request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:46:23Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 03:46:23 crc kubenswrapper[4685]: E0321 03:46:23.083968 4685 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:46:23Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 21 03:46:23 crc kubenswrapper[4685]: E0321 03:46:23.086893 4685 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:46:23Z is after 2026-02-23T05:33:13Z" node="crc" Mar 21 03:46:23 crc kubenswrapper[4685]: I0321 03:46:23.090950 4685 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 21 03:46:23 crc kubenswrapper[4685]: I0321 03:46:23.091055 4685 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 21 03:46:23 crc kubenswrapper[4685]: W0321 03:46:23.091099 4685 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:46:23Z is after 2026-02-23T05:33:13Z Mar 21 03:46:23 crc kubenswrapper[4685]: E0321 03:46:23.091225 4685 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:46:23Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 03:46:23 crc kubenswrapper[4685]: W0321 03:46:23.093312 4685 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:46:23Z is after 2026-02-23T05:33:13Z Mar 21 03:46:23 crc kubenswrapper[4685]: E0321 03:46:23.093394 4685 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T03:46:23Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 03:46:23 crc kubenswrapper[4685]: E0321 03:46:23.097753 4685 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:46:23Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ebe81dd94074f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:08.208078671 +0000 UTC m=+0.685147493,LastTimestamp:2026-03-21 03:46:08.208078671 +0000 UTC m=+0.685147493,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:23 crc kubenswrapper[4685]: W0321 03:46:23.098199 4685 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:46:23Z is after 2026-02-23T05:33:13Z Mar 21 03:46:23 crc kubenswrapper[4685]: E0321 03:46:23.098338 4685 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:46:23Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 03:46:23 crc kubenswrapper[4685]: I0321 03:46:23.113940 4685 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Mar 21 03:46:23 crc kubenswrapper[4685]: I0321 03:46:23.114083 4685 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Mar 21 03:46:23 crc kubenswrapper[4685]: I0321 03:46:23.116952 4685 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 21 03:46:23 crc kubenswrapper[4685]: I0321 03:46:23.117052 4685 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 21 03:46:23 crc 
kubenswrapper[4685]: I0321 03:46:23.214776 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:46:23Z is after 2026-02-23T05:33:13Z Mar 21 03:46:23 crc kubenswrapper[4685]: I0321 03:46:23.418893 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 21 03:46:23 crc kubenswrapper[4685]: I0321 03:46:23.421392 4685 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="635a203890cbbc6d9e5af8002033df35908fba9abc4b67d9a79657261f9c7949" exitCode=255 Mar 21 03:46:23 crc kubenswrapper[4685]: I0321 03:46:23.421459 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"635a203890cbbc6d9e5af8002033df35908fba9abc4b67d9a79657261f9c7949"} Mar 21 03:46:23 crc kubenswrapper[4685]: I0321 03:46:23.421686 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:23 crc kubenswrapper[4685]: I0321 03:46:23.422787 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:23 crc kubenswrapper[4685]: I0321 03:46:23.422848 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:23 crc kubenswrapper[4685]: I0321 03:46:23.422863 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:23 crc kubenswrapper[4685]: I0321 03:46:23.423695 4685 scope.go:117] "RemoveContainer" containerID="635a203890cbbc6d9e5af8002033df35908fba9abc4b67d9a79657261f9c7949" Mar 21 03:46:24 crc kubenswrapper[4685]: I0321 03:46:24.111147 4685 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 21 03:46:24 crc kubenswrapper[4685]: [+]log ok Mar 21 03:46:24 crc kubenswrapper[4685]: [+]etcd ok Mar 21 03:46:24 crc kubenswrapper[4685]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 21 03:46:24 crc kubenswrapper[4685]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 21 03:46:24 crc kubenswrapper[4685]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 21 03:46:24 crc kubenswrapper[4685]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 21 03:46:24 crc kubenswrapper[4685]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 21 03:46:24 crc kubenswrapper[4685]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 21 03:46:24 crc kubenswrapper[4685]: [+]poststarthook/generic-apiserver-start-informers ok Mar 21 03:46:24 crc kubenswrapper[4685]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 21 03:46:24 crc kubenswrapper[4685]: [+]poststarthook/priority-and-fairness-filter ok Mar 21 03:46:24 crc kubenswrapper[4685]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 21 03:46:24 crc kubenswrapper[4685]: [+]poststarthook/start-apiextensions-informers ok Mar 21 03:46:24 crc kubenswrapper[4685]: 
[+]poststarthook/start-apiextensions-controllers ok Mar 21 03:46:24 crc kubenswrapper[4685]: [+]poststarthook/crd-informer-synced ok Mar 21 03:46:24 crc kubenswrapper[4685]: [+]poststarthook/start-system-namespaces-controller ok Mar 21 03:46:24 crc kubenswrapper[4685]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 21 03:46:24 crc kubenswrapper[4685]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 21 03:46:24 crc kubenswrapper[4685]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 21 03:46:24 crc kubenswrapper[4685]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 21 03:46:24 crc kubenswrapper[4685]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 21 03:46:24 crc kubenswrapper[4685]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 21 03:46:24 crc kubenswrapper[4685]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Mar 21 03:46:24 crc kubenswrapper[4685]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 21 03:46:24 crc kubenswrapper[4685]: [+]poststarthook/bootstrap-controller ok Mar 21 03:46:24 crc kubenswrapper[4685]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 21 03:46:24 crc kubenswrapper[4685]: [+]poststarthook/start-kube-aggregator-informers ok Mar 21 03:46:24 crc kubenswrapper[4685]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 21 03:46:24 crc kubenswrapper[4685]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 21 03:46:24 crc kubenswrapper[4685]: [+]poststarthook/apiservice-registration-controller ok Mar 21 03:46:24 crc kubenswrapper[4685]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 21 03:46:24 crc kubenswrapper[4685]: [+]poststarthook/apiservice-discovery-controller ok Mar 21 03:46:24 crc kubenswrapper[4685]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 21 03:46:24 crc kubenswrapper[4685]: [+]autoregister-completion ok Mar 21 03:46:24 crc kubenswrapper[4685]: [+]poststarthook/apiservice-openapi-controller ok Mar 21 03:46:24 crc kubenswrapper[4685]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 21 03:46:24 crc kubenswrapper[4685]: livez check failed Mar 21 03:46:24 crc kubenswrapper[4685]: I0321 03:46:24.112636 4685 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 03:46:24 crc kubenswrapper[4685]: I0321 03:46:24.213917 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:46:24Z is after 2026-02-23T05:33:13Z Mar 21 03:46:24 crc kubenswrapper[4685]: I0321 03:46:24.405780 4685 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:46:24 crc kubenswrapper[4685]: I0321 03:46:24.428402 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 21 03:46:24 crc kubenswrapper[4685]: I0321 03:46:24.431066 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
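Annotation (not log output): the startup probe's progression above is informative. First the prober gets 403 because an unauthenticated request to /livez is treated as system:anonymous; then it gets 500 with the verbose check list, in which every check is [+] ok except [-]poststarthook/rbac/bootstrap-roles, meaning the server is up and serving but has not finished bootstrapping RBAC. A sketch of an equivalent request, assuming a service-account token at the conventional in-pod path (without credentials it reproduces the 403):

    package main

    import (
        "crypto/tls"
        "fmt"
        "io"
        "net/http"
        "os"
        "strings"
    )

    func main() {
        client := &http.Client{Transport: &http.Transport{
            // Skipped only because the serving certificate in this log
            // has expired; normally the prober trusts the cluster CA.
            TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
        }}

        req, err := http.NewRequest("GET",
            "https://api-int.crc.testing:6443/livez?verbose", nil)
        if err != nil {
            panic(err)
        }
        // Without a bearer token the request authenticates as
        // system:anonymous and gets the 403 seen above.
        if tok, err := os.ReadFile(
            "/var/run/secrets/kubernetes.io/serviceaccount/token"); err == nil {
            req.Header.Set("Authorization",
                "Bearer "+strings.TrimSpace(string(tok)))
        }

        resp, err := client.Do(req)
        if err != nil {
            panic(err)
        }
        defer resp.Body.Close()
        body, _ := io.ReadAll(resp.Body)
        fmt.Println(resp.Status) // 500 until rbac/bootstrap-roles passes
        fmt.Println(string(body))
    }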
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1c24ed6bed5931f89f0ee754ab1515937df7ecbee9152bacfd444595e8360e9c"} Mar 21 03:46:24 crc kubenswrapper[4685]: I0321 03:46:24.431225 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:24 crc kubenswrapper[4685]: I0321 03:46:24.432459 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:24 crc kubenswrapper[4685]: I0321 03:46:24.432658 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:24 crc kubenswrapper[4685]: I0321 03:46:24.432796 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:25 crc kubenswrapper[4685]: I0321 03:46:25.214725 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:46:25Z is after 2026-02-23T05:33:13Z Mar 21 03:46:25 crc kubenswrapper[4685]: I0321 03:46:25.438051 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 21 03:46:25 crc kubenswrapper[4685]: I0321 03:46:25.438729 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 21 03:46:25 crc kubenswrapper[4685]: I0321 03:46:25.440944 4685 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1c24ed6bed5931f89f0ee754ab1515937df7ecbee9152bacfd444595e8360e9c" exitCode=255 Mar 21 03:46:25 crc kubenswrapper[4685]: I0321 03:46:25.441079 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"1c24ed6bed5931f89f0ee754ab1515937df7ecbee9152bacfd444595e8360e9c"} Mar 21 03:46:25 crc kubenswrapper[4685]: I0321 03:46:25.441198 4685 scope.go:117] "RemoveContainer" containerID="635a203890cbbc6d9e5af8002033df35908fba9abc4b67d9a79657261f9c7949" Mar 21 03:46:25 crc kubenswrapper[4685]: I0321 03:46:25.441442 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:25 crc kubenswrapper[4685]: I0321 03:46:25.442861 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:25 crc kubenswrapper[4685]: I0321 03:46:25.442951 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:25 crc kubenswrapper[4685]: I0321 03:46:25.442973 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:25 crc kubenswrapper[4685]: I0321 03:46:25.443916 4685 scope.go:117] "RemoveContainer" containerID="1c24ed6bed5931f89f0ee754ab1515937df7ecbee9152bacfd444595e8360e9c" Mar 21 03:46:25 crc kubenswrapper[4685]: E0321 03:46:25.444329 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s 
restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 03:46:26 crc kubenswrapper[4685]: I0321 03:46:26.216703 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:46:26Z is after 2026-02-23T05:33:13Z Mar 21 03:46:26 crc kubenswrapper[4685]: I0321 03:46:26.449611 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 21 03:46:26 crc kubenswrapper[4685]: I0321 03:46:26.453199 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:26 crc kubenswrapper[4685]: I0321 03:46:26.454457 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:26 crc kubenswrapper[4685]: I0321 03:46:26.454558 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:26 crc kubenswrapper[4685]: I0321 03:46:26.454582 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:26 crc kubenswrapper[4685]: I0321 03:46:26.455732 4685 scope.go:117] "RemoveContainer" containerID="1c24ed6bed5931f89f0ee754ab1515937df7ecbee9152bacfd444595e8360e9c" Mar 21 03:46:26 crc kubenswrapper[4685]: E0321 03:46:26.456054 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 03:46:26 crc kubenswrapper[4685]: I0321 03:46:26.965124 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 21 03:46:26 crc kubenswrapper[4685]: I0321 03:46:26.965906 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:26 crc kubenswrapper[4685]: W0321 03:46:26.967261 4685 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:46:26Z is after 2026-02-23T05:33:13Z Mar 21 03:46:26 crc kubenswrapper[4685]: E0321 03:46:26.967384 4685 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:46:26Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 03:46:26 crc kubenswrapper[4685]: I0321 03:46:26.968157 4685 kubelet_node_status.go:724] "Recording 
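Annotation (not log output): "back-off 10s restarting failed container" is the kubelet's CrashLoopBackOff policy: the restart delay starts at 10s and doubles on each failed restart up to a 5-minute cap, resetting once a container runs cleanly for 10 minutes. check-endpoints keeps exiting with code 255 because it cannot reach the API server, so each attempt re-enters the back-off. The resulting delay sequence:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Kubelet restart back-off: 10s base, doubling, 5m cap.
        const maxDelay = 5 * time.Minute
        delay := 10 * time.Second
        for attempt := 1; attempt <= 8; attempt++ {
            fmt.Printf("failed restart %d -> wait %v\n", attempt, delay)
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay
            }
        }
        // Prints: 10s, 20s, 40s, 1m20s, 2m40s, 5m0s, 5m0s, 5m0s
    }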
event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:26 crc kubenswrapper[4685]: I0321 03:46:26.968233 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:26 crc kubenswrapper[4685]: I0321 03:46:26.968254 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:26 crc kubenswrapper[4685]: I0321 03:46:26.988805 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 21 03:46:27 crc kubenswrapper[4685]: I0321 03:46:27.214226 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:46:27Z is after 2026-02-23T05:33:13Z Mar 21 03:46:27 crc kubenswrapper[4685]: I0321 03:46:27.455523 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:27 crc kubenswrapper[4685]: I0321 03:46:27.456861 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:27 crc kubenswrapper[4685]: I0321 03:46:27.456905 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:27 crc kubenswrapper[4685]: I0321 03:46:27.456915 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:28 crc kubenswrapper[4685]: I0321 03:46:28.215824 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:46:28Z is after 2026-02-23T05:33:13Z Mar 21 03:46:28 crc kubenswrapper[4685]: E0321 03:46:28.388061 4685 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 03:46:29 crc kubenswrapper[4685]: I0321 03:46:29.103199 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:46:29 crc kubenswrapper[4685]: I0321 03:46:29.105511 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:29 crc kubenswrapper[4685]: I0321 03:46:29.107681 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:29 crc kubenswrapper[4685]: I0321 03:46:29.107740 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:29 crc kubenswrapper[4685]: I0321 03:46:29.107759 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:29 crc kubenswrapper[4685]: I0321 03:46:29.108808 4685 scope.go:117] "RemoveContainer" containerID="1c24ed6bed5931f89f0ee754ab1515937df7ecbee9152bacfd444595e8360e9c" Mar 21 03:46:29 crc kubenswrapper[4685]: E0321 03:46:29.109172 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 03:46:29 crc kubenswrapper[4685]: I0321 03:46:29.112786 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:46:29 crc kubenswrapper[4685]: I0321 03:46:29.216386 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:46:29Z is after 2026-02-23T05:33:13Z Mar 21 03:46:29 crc kubenswrapper[4685]: I0321 03:46:29.461298 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:29 crc kubenswrapper[4685]: I0321 03:46:29.462381 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:29 crc kubenswrapper[4685]: I0321 03:46:29.462429 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:29 crc kubenswrapper[4685]: I0321 03:46:29.462447 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:29 crc kubenswrapper[4685]: I0321 03:46:29.463426 4685 scope.go:117] "RemoveContainer" containerID="1c24ed6bed5931f89f0ee754ab1515937df7ecbee9152bacfd444595e8360e9c" Mar 21 03:46:29 crc kubenswrapper[4685]: E0321 03:46:29.463766 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 03:46:29 crc kubenswrapper[4685]: I0321 03:46:29.487647 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:29 crc kubenswrapper[4685]: E0321 03:46:29.487651 4685 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:46:29Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 21 03:46:29 crc kubenswrapper[4685]: I0321 03:46:29.489464 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:29 crc kubenswrapper[4685]: I0321 03:46:29.489507 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:29 crc kubenswrapper[4685]: I0321 03:46:29.489522 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:29 crc kubenswrapper[4685]: I0321 03:46:29.489552 4685 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 03:46:29 crc kubenswrapper[4685]: E0321 03:46:29.492331 4685 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
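Annotation (not log output): the lease the kubelet repeatedly "failed to ensure" is its node heartbeat, a coordination.k8s.io/v1 Lease named after the node in the kube-node-lease namespace; note the retry interval creeping up (6.4s earlier, 7s here). A client-go sketch that reads the same object once the cluster is reachable again; the kubeconfig path is a placeholder, not from the log:

    package main

    import (
        "context"
        "fmt"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        // Placeholder path; any admin kubeconfig for the cluster works.
        cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig")
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        // The object behind "Failed to ensure lease exists, will retry".
        lease, err := cs.CoordinationV1().Leases("kube-node-lease").
            Get(context.TODO(), "crc", metav1.GetOptions{})
        if err != nil {
            panic(err)
        }
        if lease.Spec.HolderIdentity != nil {
            fmt.Println("holder:", *lease.Spec.HolderIdentity)
        }
        if lease.Spec.RenewTime != nil {
            fmt.Println("last renewed:", lease.Spec.RenewTime.Time)
        }
    }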
\"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:46:29Z is after 2026-02-23T05:33:13Z" node="crc" Mar 21 03:46:30 crc kubenswrapper[4685]: W0321 03:46:30.040038 4685 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 21 03:46:30 crc kubenswrapper[4685]: E0321 03:46:30.040147 4685 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 21 03:46:30 crc kubenswrapper[4685]: I0321 03:46:30.217070 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 03:46:31 crc kubenswrapper[4685]: W0321 03:46:31.090983 4685 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 21 03:46:31 crc kubenswrapper[4685]: E0321 03:46:31.091069 4685 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 21 03:46:31 crc kubenswrapper[4685]: I0321 03:46:31.217343 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 03:46:31 crc kubenswrapper[4685]: I0321 03:46:31.604362 4685 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 21 03:46:31 crc kubenswrapper[4685]: I0321 03:46:31.632213 4685 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 21 03:46:32 crc kubenswrapper[4685]: I0321 03:46:32.216259 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 03:46:32 crc kubenswrapper[4685]: I0321 03:46:32.466234 4685 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 21 03:46:32 crc kubenswrapper[4685]: I0321 03:46:32.466385 4685 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context 
deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.102144 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ebe81dd94074f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:08.208078671 +0000 UTC m=+0.685147493,LastTimestamp:2026-03-21 03:46:08.208078671 +0000 UTC m=+0.685147493,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: I0321 03:46:33.112377 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.112501 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ebe81e28eb79f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:08.291616671 +0000 UTC m=+0.768685473,LastTimestamp:2026-03-21 03:46:08.291616671 +0000 UTC m=+0.768685473,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: I0321 03:46:33.113337 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:33 crc kubenswrapper[4685]: I0321 03:46:33.115621 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:33 crc kubenswrapper[4685]: I0321 03:46:33.115953 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:33 crc kubenswrapper[4685]: I0321 03:46:33.116176 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:33 crc kubenswrapper[4685]: I0321 03:46:33.117611 4685 scope.go:117] "RemoveContainer" containerID="1c24ed6bed5931f89f0ee754ab1515937df7ecbee9152bacfd444595e8360e9c" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.118175 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.119382 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in 
the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ebe81e28f1f5d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:08.291643229 +0000 UTC m=+0.768712031,LastTimestamp:2026-03-21 03:46:08.291643229 +0000 UTC m=+0.768712031,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.120997 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ebe81e28f5057 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:08.291655767 +0000 UTC m=+0.768724569,LastTimestamp:2026-03-21 03:46:08.291655767 +0000 UTC m=+0.768724569,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.128214 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ebe81e7ab2b86 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:08.37736743 +0000 UTC m=+0.854436232,LastTimestamp:2026-03-21 03:46:08.37736743 +0000 UTC m=+0.854436232,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.136570 4685 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ebe81e28eb79f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ebe81e28eb79f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:08.291616671 +0000 UTC m=+0.768685473,LastTimestamp:2026-03-21 03:46:08.40218729 +0000 UTC m=+0.879256092,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.143847 4685 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ebe81e28f1f5d\" is forbidden: User 
\"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ebe81e28f1f5d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:08.291643229 +0000 UTC m=+0.768712031,LastTimestamp:2026-03-21 03:46:08.402216257 +0000 UTC m=+0.879285059,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.152080 4685 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ebe81e28f5057\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ebe81e28f5057 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:08.291655767 +0000 UTC m=+0.768724569,LastTimestamp:2026-03-21 03:46:08.402227826 +0000 UTC m=+0.879296628,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.159749 4685 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ebe81e28eb79f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ebe81e28eb79f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:08.291616671 +0000 UTC m=+0.768685473,LastTimestamp:2026-03-21 03:46:08.403631346 +0000 UTC m=+0.880700158,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.165416 4685 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ebe81e28f1f5d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ebe81e28f1f5d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:08.291643229 +0000 UTC m=+0.768712031,LastTimestamp:2026-03-21 03:46:08.403683851 +0000 UTC m=+0.880752653,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.171225 4685 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ebe81e28f5057\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ebe81e28f5057 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:08.291655767 +0000 UTC m=+0.768724569,LastTimestamp:2026-03-21 03:46:08.40369798 +0000 UTC m=+0.880766792,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.178407 4685 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ebe81e28eb79f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ebe81e28eb79f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:08.291616671 +0000 UTC m=+0.768685473,LastTimestamp:2026-03-21 03:46:08.405903301 +0000 UTC m=+0.882972103,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.185680 4685 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ebe81e28f1f5d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ebe81e28f1f5d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:08.291643229 +0000 UTC m=+0.768712031,LastTimestamp:2026-03-21 03:46:08.40591842 +0000 UTC m=+0.882987222,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.192955 4685 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ebe81e28f5057\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ebe81e28f5057 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:08.291655767 +0000 UTC m=+0.768724569,LastTimestamp:2026-03-21 03:46:08.405930599 +0000 UTC m=+0.882999401,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.200078 4685 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ebe81e28eb79f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ebe81e28eb79f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:08.291616671 +0000 UTC m=+0.768685473,LastTimestamp:2026-03-21 03:46:08.406089135 +0000 UTC m=+0.883157957,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.209636 4685 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ebe81e28f1f5d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ebe81e28f1f5d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:08.291643229 +0000 UTC m=+0.768712031,LastTimestamp:2026-03-21 03:46:08.406126662 +0000 UTC m=+0.883195484,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: I0321 03:46:33.216006 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.216053 4685 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ebe81e28f5057\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ebe81e28f5057 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:08.291655767 +0000 UTC m=+0.768724569,LastTimestamp:2026-03-21 03:46:08.40614342 +0000 UTC m=+0.883212242,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.217539 4685 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ebe81e28eb79f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ebe81e28eb79f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:08.291616671 +0000 UTC m=+0.768685473,LastTimestamp:2026-03-21 03:46:08.407223838 +0000 UTC m=+0.884292640,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.224883 4685 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ebe81e28f1f5d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ebe81e28f1f5d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:08.291643229 +0000 UTC m=+0.768712031,LastTimestamp:2026-03-21 03:46:08.407245616 +0000 UTC m=+0.884314418,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.231797 4685 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ebe81e28f5057\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ebe81e28f5057 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:08.291655767 +0000 UTC m=+0.768724569,LastTimestamp:2026-03-21 03:46:08.407257215 +0000 UTC m=+0.884326017,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.237638 4685 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ebe81e28eb79f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ebe81e28eb79f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:08.291616671 +0000 UTC m=+0.768685473,LastTimestamp:2026-03-21 03:46:08.408760596 +0000 UTC m=+0.885829428,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.245229 4685 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ebe81e28f1f5d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace 
\"default\"" event="&Event{ObjectMeta:{crc.189ebe81e28f1f5d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:08.291643229 +0000 UTC m=+0.768712031,LastTimestamp:2026-03-21 03:46:08.408800142 +0000 UTC m=+0.885868974,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.251727 4685 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ebe81e28f5057\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ebe81e28f5057 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:08.291655767 +0000 UTC m=+0.768724569,LastTimestamp:2026-03-21 03:46:08.40882771 +0000 UTC m=+0.885896542,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.259580 4685 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ebe81e28eb79f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ebe81e28eb79f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:08.291616671 +0000 UTC m=+0.768685473,LastTimestamp:2026-03-21 03:46:08.410580189 +0000 UTC m=+0.887649001,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.266364 4685 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ebe81e28f1f5d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ebe81e28f1f5d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:08.291643229 +0000 UTC m=+0.768712031,LastTimestamp:2026-03-21 03:46:08.410603737 +0000 UTC m=+0.887672539,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.276235 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ebe8201fa859c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:08.818775452 +0000 UTC m=+1.295844244,LastTimestamp:2026-03-21 03:46:08.818775452 +0000 UTC m=+1.295844244,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.283255 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ebe8201fa971b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:08.818779931 +0000 UTC m=+1.295848733,LastTimestamp:2026-03-21 03:46:08.818779931 +0000 UTC m=+1.295848733,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.289796 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ebe8201fd09d9 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:08.818940377 +0000 UTC m=+1.296009209,LastTimestamp:2026-03-21 03:46:08.818940377 +0000 UTC m=+1.296009209,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.295608 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189ebe8201ff8381 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:08.819102593 +0000 UTC m=+1.296171425,LastTimestamp:2026-03-21 03:46:08.819102593 +0000 UTC m=+1.296171425,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.303245 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ebe820264e57d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:08.825746813 +0000 UTC m=+1.302815615,LastTimestamp:2026-03-21 03:46:08.825746813 +0000 UTC m=+1.302815615,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.311307 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ebe822702c304 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:09.440072452 +0000 UTC m=+1.917141254,LastTimestamp:2026-03-21 03:46:09.440072452 +0000 UTC m=+1.917141254,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.318535 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ebe82270a7c30 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:09.440578608 +0000 UTC m=+1.917647400,LastTimestamp:2026-03-21 03:46:09.440578608 +0000 UTC m=+1.917647400,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.325395 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ebe8227100d6b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:09.440943467 +0000 UTC m=+1.918012249,LastTimestamp:2026-03-21 03:46:09.440943467 +0000 UTC m=+1.918012249,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.332103 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ebe8227104c75 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:09.440959605 +0000 UTC m=+1.918028397,LastTimestamp:2026-03-21 03:46:09.440959605 +0000 UTC m=+1.918028397,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.339262 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ebe82271c5482 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:09.441748098 +0000 UTC m=+1.918816890,LastTimestamp:2026-03-21 03:46:09.441748098 +0000 UTC 
m=+1.918816890,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.345775 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ebe8227c9891d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:09.453099293 +0000 UTC m=+1.930168085,LastTimestamp:2026-03-21 03:46:09.453099293 +0000 UTC m=+1.930168085,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.353514 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ebe8227e19ac6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:09.454676678 +0000 UTC m=+1.931745460,LastTimestamp:2026-03-21 03:46:09.454676678 +0000 UTC m=+1.931745460,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.360514 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ebe8227f0fb64 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:09.455684452 +0000 UTC m=+1.932753244,LastTimestamp:2026-03-21 03:46:09.455684452 +0000 UTC m=+1.932753244,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.367382 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189ebe82283c82cd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:09.460634317 +0000 UTC m=+1.937703109,LastTimestamp:2026-03-21 03:46:09.460634317 +0000 UTC m=+1.937703109,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.375087 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ebe82285790c4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:09.462407364 +0000 UTC m=+1.939476156,LastTimestamp:2026-03-21 03:46:09.462407364 +0000 UTC m=+1.939476156,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.382914 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ebe82286d27ad openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:09.463822253 +0000 UTC m=+1.940891055,LastTimestamp:2026-03-21 03:46:09.463822253 +0000 UTC m=+1.940891055,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.390843 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ebe823dff733f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:09.825731391 +0000 UTC m=+2.302800193,LastTimestamp:2026-03-21 03:46:09.825731391 +0000 UTC m=+2.302800193,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.397098 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ebe823ed52bc9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:09.839737801 +0000 UTC m=+2.316806623,LastTimestamp:2026-03-21 03:46:09.839737801 +0000 UTC m=+2.316806623,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.402488 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ebe823ef59203 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:09.841861123 +0000 UTC m=+2.318929925,LastTimestamp:2026-03-21 03:46:09.841861123 +0000 UTC m=+2.318929925,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.409110 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ebe824d156b31 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:10.078829361 +0000 UTC m=+2.555898153,LastTimestamp:2026-03-21 03:46:10.078829361 +0000 UTC m=+2.555898153,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.414320 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ebe824ded78fc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:10.092988668 +0000 UTC m=+2.570057460,LastTimestamp:2026-03-21 03:46:10.092988668 +0000 UTC m=+2.570057460,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.420947 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ebe824e05591c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:10.094553372 +0000 UTC m=+2.571622164,LastTimestamp:2026-03-21 03:46:10.094553372 +0000 UTC m=+2.571622164,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.425598 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ebe825baecfd5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:10.323763157 +0000 UTC m=+2.800831989,LastTimestamp:2026-03-21 03:46:10.323763157 +0000 UTC m=+2.800831989,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.432141 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ebe825bda5696 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:10.326615702 +0000 UTC m=+2.803684534,LastTimestamp:2026-03-21 03:46:10.326615702 +0000 UTC m=+2.803684534,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.437370 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ebe825c41b327 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:10.333389607 +0000 UTC m=+2.810458439,LastTimestamp:2026-03-21 03:46:10.333389607 +0000 UTC m=+2.810458439,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.442616 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ebe825cf2ebc3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:10.345003971 +0000 UTC m=+2.822072753,LastTimestamp:2026-03-21 03:46:10.345003971 +0000 UTC m=+2.822072753,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.447206 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ebe825dadbd5a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:10.357247322 +0000 UTC m=+2.834316114,LastTimestamp:2026-03-21 03:46:10.357247322 +0000 UTC m=+2.834316114,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.454749 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ebe825f3303e9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:10.382758889 +0000 UTC m=+2.859827681,LastTimestamp:2026-03-21 03:46:10.382758889 +0000 UTC m=+2.859827681,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.462225 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ebe826b1f8e30 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:10.58281016 +0000 UTC m=+3.059878952,LastTimestamp:2026-03-21 03:46:10.58281016 +0000 UTC m=+3.059878952,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.466500 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ebe826b3721f3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:10.584355315 +0000 UTC m=+3.061424097,LastTimestamp:2026-03-21 03:46:10.584355315 +0000 UTC m=+3.061424097,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.474075 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ebe826bb59b24 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:10.592643876 +0000 UTC m=+3.069712668,LastTimestamp:2026-03-21 03:46:10.592643876 +0000 UTC m=+3.069712668,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.481168 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ebe826bfa6b75 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:10.597153653 +0000 UTC m=+3.074222445,LastTimestamp:2026-03-21 03:46:10.597153653 +0000 UTC 
m=+3.074222445,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.487585 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ebe826c346102 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:10.600952066 +0000 UTC m=+3.078020868,LastTimestamp:2026-03-21 03:46:10.600952066 +0000 UTC m=+3.078020868,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.492187 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ebe826cb5a24a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:10.609422922 +0000 UTC m=+3.086491714,LastTimestamp:2026-03-21 03:46:10.609422922 +0000 UTC m=+3.086491714,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.496554 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ebe826cfa78c7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:10.613934279 +0000 UTC m=+3.091003071,LastTimestamp:2026-03-21 03:46:10.613934279 +0000 UTC m=+3.091003071,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.500617 4685 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ebe826d8ff174 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:10.623730036 +0000 UTC m=+3.100798828,LastTimestamp:2026-03-21 03:46:10.623730036 +0000 UTC m=+3.100798828,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.504437 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ebe826dbb2ab5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:10.626562741 +0000 UTC m=+3.103631533,LastTimestamp:2026-03-21 03:46:10.626562741 +0000 UTC m=+3.103631533,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.508744 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ebe826ed9be9e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:10.645343902 +0000 UTC m=+3.122412694,LastTimestamp:2026-03-21 03:46:10.645343902 +0000 UTC m=+3.122412694,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.516686 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ebe827a68a6f9 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:10.839258873 +0000 UTC m=+3.316327665,LastTimestamp:2026-03-21 03:46:10.839258873 +0000 UTC m=+3.316327665,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.525255 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ebe827a6b2d86 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:10.83942439 +0000 UTC m=+3.316493192,LastTimestamp:2026-03-21 03:46:10.83942439 +0000 UTC m=+3.316493192,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.533238 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ebe827b693dcb openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:10.856074699 +0000 UTC m=+3.333143511,LastTimestamp:2026-03-21 03:46:10.856074699 +0000 UTC m=+3.333143511,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.540327 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ebe827b7c8972 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:10.85733925 +0000 UTC m=+3.334408052,LastTimestamp:2026-03-21 03:46:10.85733925 +0000 UTC m=+3.334408052,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.547416 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ebe827b8bd6e6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:10.858342118 +0000 UTC m=+3.335410930,LastTimestamp:2026-03-21 03:46:10.858342118 +0000 UTC m=+3.335410930,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.554388 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ebe827b9ac149 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:10.859319625 +0000 UTC m=+3.336388427,LastTimestamp:2026-03-21 03:46:10.859319625 +0000 UTC m=+3.336388427,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.562587 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ebe8287fa2b4a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:11.066899274 +0000 UTC 
m=+3.543968066,LastTimestamp:2026-03-21 03:46:11.066899274 +0000 UTC m=+3.543968066,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.569155 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ebe82884f4a5b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:11.072477787 +0000 UTC m=+3.549546569,LastTimestamp:2026-03-21 03:46:11.072477787 +0000 UTC m=+3.549546569,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.578740 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ebe8288a4effd openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:11.078090749 +0000 UTC m=+3.555159541,LastTimestamp:2026-03-21 03:46:11.078090749 +0000 UTC m=+3.555159541,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.585612 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ebe82895ed05f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:11.090272351 +0000 UTC m=+3.567341143,LastTimestamp:2026-03-21 03:46:11.090272351 +0000 UTC m=+3.567341143,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.594236 4685 event.go:359] "Server rejected event 
(will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ebe82897072a4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:11.091428004 +0000 UTC m=+3.568496806,LastTimestamp:2026-03-21 03:46:11.091428004 +0000 UTC m=+3.568496806,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.604100 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ebe82957fb5dc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:11.293754844 +0000 UTC m=+3.770823636,LastTimestamp:2026-03-21 03:46:11.293754844 +0000 UTC m=+3.770823636,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.611550 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ebe8296566d32 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:11.307826482 +0000 UTC m=+3.784895284,LastTimestamp:2026-03-21 03:46:11.307826482 +0000 UTC m=+3.784895284,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.616764 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ebe82966cb4db openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 
UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:11.309286619 +0000 UTC m=+3.786355421,LastTimestamp:2026-03-21 03:46:11.309286619 +0000 UTC m=+3.786355421,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.622793 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ebe8298fcef39 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:11.352293177 +0000 UTC m=+3.829361989,LastTimestamp:2026-03-21 03:46:11.352293177 +0000 UTC m=+3.829361989,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.625780 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ebe82b0f9afc2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:11.754733506 +0000 UTC m=+4.231802298,LastTimestamp:2026-03-21 03:46:11.754733506 +0000 UTC m=+4.231802298,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.630145 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ebe82b0fbdd48 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container 
kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:11.754876232 +0000 UTC m=+4.231945044,LastTimestamp:2026-03-21 03:46:11.754876232 +0000 UTC m=+4.231945044,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.635573 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ebe82b28c7493 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:11.781129363 +0000 UTC m=+4.258198155,LastTimestamp:2026-03-21 03:46:11.781129363 +0000 UTC m=+4.258198155,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.642739 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ebe82b294e422 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:11.78168221 +0000 UTC m=+4.258751002,LastTimestamp:2026-03-21 03:46:11.78168221 +0000 UTC m=+4.258751002,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.651112 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ebe82d60a4978 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:12.376578424 +0000 UTC m=+4.853647226,LastTimestamp:2026-03-21 03:46:12.376578424 +0000 UTC m=+4.853647226,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.657654 4685 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ebe82e3e87872 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:12.60924325 +0000 UTC m=+5.086312072,LastTimestamp:2026-03-21 03:46:12.60924325 +0000 UTC m=+5.086312072,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.663486 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ebe82e4abed96 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:12.622052758 +0000 UTC m=+5.099121580,LastTimestamp:2026-03-21 03:46:12.622052758 +0000 UTC m=+5.099121580,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.670573 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ebe82e4c860b3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:12.623917235 +0000 UTC m=+5.100986067,LastTimestamp:2026-03-21 03:46:12.623917235 +0000 UTC m=+5.100986067,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.676895 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ebe82f4e62bb0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container 
etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:12.8943052 +0000 UTC m=+5.371374022,LastTimestamp:2026-03-21 03:46:12.8943052 +0000 UTC m=+5.371374022,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.681917 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ebe82f5efb8fd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:12.911708413 +0000 UTC m=+5.388777215,LastTimestamp:2026-03-21 03:46:12.911708413 +0000 UTC m=+5.388777215,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.686144 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ebe82f605b2a4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:12.91314858 +0000 UTC m=+5.390217382,LastTimestamp:2026-03-21 03:46:12.91314858 +0000 UTC m=+5.390217382,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.691909 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ebe83051a0e04 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:13.166140932 +0000 UTC m=+5.643209764,LastTimestamp:2026-03-21 03:46:13.166140932 +0000 UTC m=+5.643209764,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.698115 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ebe83061b56eb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:13.183002347 +0000 UTC m=+5.660071189,LastTimestamp:2026-03-21 03:46:13.183002347 +0000 UTC m=+5.660071189,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.703047 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ebe830635149c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:13.184689308 +0000 UTC m=+5.661758100,LastTimestamp:2026-03-21 03:46:13.184689308 +0000 UTC m=+5.661758100,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.710803 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ebe831697c2e6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:13.45959191 +0000 UTC m=+5.936660742,LastTimestamp:2026-03-21 03:46:13.45959191 +0000 UTC m=+5.936660742,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.717550 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ebe8317aa37f3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:13.477578739 +0000 UTC 
m=+5.954647571,LastTimestamp:2026-03-21 03:46:13.477578739 +0000 UTC m=+5.954647571,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.722009 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ebe8317c1a320 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:13.479113504 +0000 UTC m=+5.956182326,LastTimestamp:2026-03-21 03:46:13.479113504 +0000 UTC m=+5.956182326,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.727114 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ebe8328a756b8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:13.76260268 +0000 UTC m=+6.239671512,LastTimestamp:2026-03-21 03:46:13.76260268 +0000 UTC m=+6.239671512,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.729656 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ebe8329d4a6ed openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:13.782349549 +0000 UTC m=+6.259418551,LastTimestamp:2026-03-21 03:46:13.782349549 +0000 UTC m=+6.259418551,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.737416 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 21 03:46:33 crc kubenswrapper[4685]: 
&Event{ObjectMeta:{kube-controller-manager-crc.189ebe852f71eb64 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 21 03:46:33 crc kubenswrapper[4685]: body: Mar 21 03:46:33 crc kubenswrapper[4685]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:22.4664769 +0000 UTC m=+14.943545702,LastTimestamp:2026-03-21 03:46:22.4664769 +0000 UTC m=+14.943545702,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 21 03:46:33 crc kubenswrapper[4685]: > Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.741722 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ebe852f73c4a6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:22.466598054 +0000 UTC m=+14.943666856,LastTimestamp:2026-03-21 03:46:22.466598054 +0000 UTC m=+14.943666856,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.745948 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 21 03:46:33 crc kubenswrapper[4685]: &Event{ObjectMeta:{kube-apiserver-crc.189ebe8554aba82d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 21 03:46:33 crc kubenswrapper[4685]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 21 03:46:33 crc kubenswrapper[4685]: Mar 21 03:46:33 crc kubenswrapper[4685]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:23.091017773 +0000 UTC m=+15.568086575,LastTimestamp:2026-03-21 03:46:23.091017773 +0000 UTC 
m=+15.568086575,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 21 03:46:33 crc kubenswrapper[4685]: > Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.750064 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ebe8554acc66f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:23.091091055 +0000 UTC m=+15.568159857,LastTimestamp:2026-03-21 03:46:23.091091055 +0000 UTC m=+15.568159857,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.754217 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 21 03:46:33 crc kubenswrapper[4685]: &Event{ObjectMeta:{kube-apiserver-crc.189ebe85560ad571 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Readiness probe error: Get "https://192.168.126.11:17697/healthz": dial tcp 192.168.126.11:17697: connect: connection refused Mar 21 03:46:33 crc kubenswrapper[4685]: body: Mar 21 03:46:33 crc kubenswrapper[4685]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:23.114032497 +0000 UTC m=+15.591101319,LastTimestamp:2026-03-21 03:46:23.114032497 +0000 UTC m=+15.591101319,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 21 03:46:33 crc kubenswrapper[4685]: > Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.758286 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ebe85560c9a30 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:23.1141484 +0000 UTC m=+15.591217232,LastTimestamp:2026-03-21 03:46:23.1141484 +0000 UTC 
m=+15.591217232,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.764910 4685 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ebe8554aba82d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 21 03:46:33 crc kubenswrapper[4685]: &Event{ObjectMeta:{kube-apiserver-crc.189ebe8554aba82d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 21 03:46:33 crc kubenswrapper[4685]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 21 03:46:33 crc kubenswrapper[4685]: Mar 21 03:46:33 crc kubenswrapper[4685]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:23.091017773 +0000 UTC m=+15.568086575,LastTimestamp:2026-03-21 03:46:23.117009263 +0000 UTC m=+15.594078065,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 21 03:46:33 crc kubenswrapper[4685]: > Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.769903 4685 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ebe8554acc66f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ebe8554acc66f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:23.091091055 +0000 UTC m=+15.568159857,LastTimestamp:2026-03-21 03:46:23.117083215 +0000 UTC m=+15.594152017,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.776939 4685 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ebe82966cb4db\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ebe82966cb4db openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:11.309286619 +0000 UTC m=+3.786355421,LastTimestamp:2026-03-21 03:46:23.424883032 +0000 UTC m=+15.901951834,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.783732 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 21 03:46:33 crc kubenswrapper[4685]: &Event{ObjectMeta:{kube-controller-manager-crc.189ebe87837babd1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 21 03:46:33 crc kubenswrapper[4685]: body: Mar 21 03:46:33 crc kubenswrapper[4685]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:32.466336721 +0000 UTC m=+24.943405543,LastTimestamp:2026-03-21 03:46:32.466336721 +0000 UTC m=+24.943405543,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 21 03:46:33 crc kubenswrapper[4685]: > Mar 21 03:46:33 crc kubenswrapper[4685]: E0321 03:46:33.787996 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ebe87837d5164 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:32.466444644 +0000 UTC m=+24.943513476,LastTimestamp:2026-03-21 03:46:32.466444644 +0000 UTC m=+24.943513476,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:34 crc kubenswrapper[4685]: W0321 03:46:34.132115 4685 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 21 03:46:34 crc kubenswrapper[4685]: E0321 03:46:34.132181 4685 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list 
*v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 21 03:46:34 crc kubenswrapper[4685]: I0321 03:46:34.215452 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 03:46:34 crc kubenswrapper[4685]: I0321 03:46:34.406001 4685 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:46:34 crc kubenswrapper[4685]: I0321 03:46:34.406355 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:34 crc kubenswrapper[4685]: I0321 03:46:34.408484 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:34 crc kubenswrapper[4685]: I0321 03:46:34.408532 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:34 crc kubenswrapper[4685]: I0321 03:46:34.408551 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:34 crc kubenswrapper[4685]: I0321 03:46:34.409604 4685 scope.go:117] "RemoveContainer" containerID="1c24ed6bed5931f89f0ee754ab1515937df7ecbee9152bacfd444595e8360e9c" Mar 21 03:46:34 crc kubenswrapper[4685]: E0321 03:46:34.410044 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 03:46:35 crc kubenswrapper[4685]: I0321 03:46:35.218132 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 03:46:36 crc kubenswrapper[4685]: I0321 03:46:36.217778 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 03:46:36 crc kubenswrapper[4685]: I0321 03:46:36.492935 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:36 crc kubenswrapper[4685]: I0321 03:46:36.495781 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:36 crc kubenswrapper[4685]: I0321 03:46:36.495909 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:36 crc kubenswrapper[4685]: I0321 03:46:36.495931 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:36 crc kubenswrapper[4685]: I0321 03:46:36.495976 4685 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 03:46:36 crc kubenswrapper[4685]: E0321 03:46:36.499513 4685 controller.go:145] "Failed to ensure lease 
exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 21 03:46:36 crc kubenswrapper[4685]: E0321 03:46:36.500353 4685 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 21 03:46:37 crc kubenswrapper[4685]: I0321 03:46:37.217488 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 03:46:38 crc kubenswrapper[4685]: W0321 03:46:38.176736 4685 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 21 03:46:38 crc kubenswrapper[4685]: E0321 03:46:38.176815 4685 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 21 03:46:38 crc kubenswrapper[4685]: I0321 03:46:38.217946 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 03:46:38 crc kubenswrapper[4685]: E0321 03:46:38.388409 4685 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 03:46:39 crc kubenswrapper[4685]: I0321 03:46:39.217611 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 03:46:40 crc kubenswrapper[4685]: I0321 03:46:40.216482 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 03:46:41 crc kubenswrapper[4685]: I0321 03:46:41.000797 4685 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:55934->192.168.126.11:10357: read: connection reset by peer" start-of-body= Mar 21 03:46:41 crc kubenswrapper[4685]: I0321 03:46:41.000942 4685 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:55934->192.168.126.11:10357: read: connection reset by peer" Mar 21 03:46:41 crc kubenswrapper[4685]: I0321 03:46:41.001042 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
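[Annotation] Every rejection above has the same shape: the kubelet's requests reach the API server as system:anonymous, so event creation, node registration, lease renewal, and the CSINode lookups are all denied until the node's client credentials are accepted. A minimal sketch, assuming admin kubeconfig access and the official Kubernetes Python client, of replaying one of these authorization decisions (the user, verb, resource, and namespace are taken from the log; nothing else here is shown in it):

    # Sketch: ask the API server whether system:anonymous may create
    # events in openshift-etcd, mirroring the denials logged above.
    # Assumes a reachable cluster and credentials in ~/.kube/config.
    from kubernetes import client, config

    config.load_kube_config()
    sar = client.V1SubjectAccessReview(
        spec=client.V1SubjectAccessReviewSpec(
            user="system:anonymous",
            resource_attributes=client.V1ResourceAttributes(
                namespace="openshift-etcd",
                verb="create",
                resource="events",
            ),
        )
    )
    resp = client.AuthorizationV1Api().create_subject_access_review(sar)
    # Expect allowed=False while the kubelet is still unauthenticated.
    print(resp.status.allowed, resp.status.reason)
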
Mar 21 03:46:41 crc kubenswrapper[4685]: I0321 03:46:41.001269 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:41 crc kubenswrapper[4685]: I0321 03:46:41.003469 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:41 crc kubenswrapper[4685]: I0321 03:46:41.003540 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:41 crc kubenswrapper[4685]: I0321 03:46:41.003562 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:41 crc kubenswrapper[4685]: I0321 03:46:41.004479 4685 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"229992b297c1eb9aa6f92e56505bdf819e392fb777636759549854330bf022ab"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 21 03:46:41 crc kubenswrapper[4685]: I0321 03:46:41.004831 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://229992b297c1eb9aa6f92e56505bdf819e392fb777636759549854330bf022ab" gracePeriod=30 Mar 21 03:46:41 crc kubenswrapper[4685]: E0321 03:46:41.016623 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 21 03:46:41 crc kubenswrapper[4685]: &Event{ObjectMeta:{kube-controller-manager-crc.189ebe89802ef520 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": read tcp 192.168.126.11:55934->192.168.126.11:10357: read: connection reset by peer Mar 21 03:46:41 crc kubenswrapper[4685]: body: Mar 21 03:46:41 crc kubenswrapper[4685]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:41.00091216 +0000 UTC m=+33.477981012,LastTimestamp:2026-03-21 03:46:41.00091216 +0000 UTC m=+33.477981012,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 21 03:46:41 crc kubenswrapper[4685]: > Mar 21 03:46:41 crc kubenswrapper[4685]: E0321 03:46:41.024739 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ebe89803038a7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:55934->192.168.126.11:10357: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:41.000994983 +0000 UTC m=+33.478063815,LastTimestamp:2026-03-21 03:46:41.000994983 +0000 UTC m=+33.478063815,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:41 crc kubenswrapper[4685]: E0321 03:46:41.033720 4685 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ebe89806a42a9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:41.004798633 +0000 UTC m=+33.481867455,LastTimestamp:2026-03-21 03:46:41.004798633 +0000 UTC m=+33.481867455,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:41 crc kubenswrapper[4685]: I0321 03:46:41.216904 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 03:46:41 crc kubenswrapper[4685]: I0321 03:46:41.502825 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 21 03:46:41 crc kubenswrapper[4685]: I0321 03:46:41.503484 4685 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="229992b297c1eb9aa6f92e56505bdf819e392fb777636759549854330bf022ab" exitCode=255 Mar 21 03:46:41 crc kubenswrapper[4685]: I0321 03:46:41.503592 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"229992b297c1eb9aa6f92e56505bdf819e392fb777636759549854330bf022ab"} Mar 21 03:46:41 crc kubenswrapper[4685]: E0321 03:46:41.542964 4685 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ebe82285790c4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ebe82285790c4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:09.462407364 +0000 UTC m=+1.939476156,LastTimestamp:2026-03-21 03:46:41.534689166 +0000 UTC m=+34.011757998,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:41 crc kubenswrapper[4685]: E0321 03:46:41.822681 4685 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ebe823dff733f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ebe823dff733f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:09.825731391 +0000 UTC m=+2.302800193,LastTimestamp:2026-03-21 03:46:41.813562199 +0000 UTC m=+34.290631031,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:41 crc kubenswrapper[4685]: E0321 03:46:41.836487 4685 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ebe823ed52bc9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ebe823ed52bc9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:46:09.839737801 +0000 UTC m=+2.316806623,LastTimestamp:2026-03-21 03:46:41.828659114 +0000 UTC m=+34.305727946,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:46:42 crc kubenswrapper[4685]: I0321 03:46:42.217605 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 03:46:42 crc kubenswrapper[4685]: I0321 03:46:42.512246 4685 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 21 03:46:42 crc kubenswrapper[4685]: I0321 03:46:42.513266 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d0be6102e44775f379b67814ed8f979bf81153bd68b519bc5c5e6cd2e3cb8169"} Mar 21 03:46:42 crc kubenswrapper[4685]: I0321 03:46:42.513460 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:42 crc kubenswrapper[4685]: I0321 03:46:42.515033 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:42 crc kubenswrapper[4685]: I0321 03:46:42.515083 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:42 crc kubenswrapper[4685]: I0321 03:46:42.515096 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:43 crc kubenswrapper[4685]: I0321 03:46:43.218354 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 03:46:43 crc kubenswrapper[4685]: I0321 03:46:43.500651 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:43 crc kubenswrapper[4685]: I0321 03:46:43.502558 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:43 crc kubenswrapper[4685]: I0321 03:46:43.502627 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:43 crc kubenswrapper[4685]: I0321 03:46:43.502652 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:43 crc kubenswrapper[4685]: I0321 03:46:43.502708 4685 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 03:46:43 crc kubenswrapper[4685]: E0321 03:46:43.507913 4685 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 21 03:46:43 crc kubenswrapper[4685]: E0321 03:46:43.507953 4685 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 21 03:46:43 crc kubenswrapper[4685]: I0321 03:46:43.517132 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:43 crc kubenswrapper[4685]: I0321 03:46:43.518376 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:43 crc kubenswrapper[4685]: I0321 03:46:43.518437 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:43 crc kubenswrapper[4685]: I0321 03:46:43.518466 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 21 03:46:44 crc kubenswrapper[4685]: I0321 03:46:44.217544 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 03:46:45 crc kubenswrapper[4685]: I0321 03:46:45.218740 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 03:46:46 crc kubenswrapper[4685]: I0321 03:46:46.217723 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 03:46:47 crc kubenswrapper[4685]: I0321 03:46:47.218920 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 03:46:48 crc kubenswrapper[4685]: I0321 03:46:48.079071 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 03:46:48 crc kubenswrapper[4685]: I0321 03:46:48.079690 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:48 crc kubenswrapper[4685]: I0321 03:46:48.081508 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:48 crc kubenswrapper[4685]: I0321 03:46:48.081737 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:48 crc kubenswrapper[4685]: I0321 03:46:48.081914 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:48 crc kubenswrapper[4685]: I0321 03:46:48.218532 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 03:46:48 crc kubenswrapper[4685]: E0321 03:46:48.388640 4685 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 03:46:49 crc kubenswrapper[4685]: I0321 03:46:49.217784 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 03:46:49 crc kubenswrapper[4685]: I0321 03:46:49.300533 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:49 crc kubenswrapper[4685]: I0321 03:46:49.302275 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:49 crc kubenswrapper[4685]: I0321 03:46:49.302329 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:49 crc kubenswrapper[4685]: I0321 03:46:49.302347 4685 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:49 crc kubenswrapper[4685]: I0321 03:46:49.303389 4685 scope.go:117] "RemoveContainer" containerID="1c24ed6bed5931f89f0ee754ab1515937df7ecbee9152bacfd444595e8360e9c" Mar 21 03:46:49 crc kubenswrapper[4685]: I0321 03:46:49.465280 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 03:46:49 crc kubenswrapper[4685]: I0321 03:46:49.465496 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:49 crc kubenswrapper[4685]: I0321 03:46:49.468967 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:49 crc kubenswrapper[4685]: I0321 03:46:49.469077 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:49 crc kubenswrapper[4685]: I0321 03:46:49.469094 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:49 crc kubenswrapper[4685]: I0321 03:46:49.471624 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 03:46:49 crc kubenswrapper[4685]: I0321 03:46:49.534816 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:49 crc kubenswrapper[4685]: I0321 03:46:49.536223 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:49 crc kubenswrapper[4685]: I0321 03:46:49.536304 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:49 crc kubenswrapper[4685]: I0321 03:46:49.536331 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:50 crc kubenswrapper[4685]: I0321 03:46:50.217638 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 03:46:50 crc kubenswrapper[4685]: I0321 03:46:50.508867 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:50 crc kubenswrapper[4685]: I0321 03:46:50.510479 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:50 crc kubenswrapper[4685]: I0321 03:46:50.510611 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:50 crc kubenswrapper[4685]: I0321 03:46:50.510700 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:50 crc kubenswrapper[4685]: I0321 03:46:50.510857 4685 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 03:46:50 crc kubenswrapper[4685]: E0321 03:46:50.516094 4685 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 21 03:46:50 crc kubenswrapper[4685]: E0321 03:46:50.516133 4685 controller.go:145] "Failed to ensure lease exists, will retry" 
err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 21 03:46:50 crc kubenswrapper[4685]: I0321 03:46:50.539683 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 21 03:46:50 crc kubenswrapper[4685]: I0321 03:46:50.541450 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a2a75c9df2a4e1b36d26acbbc5b2b7cb6e790962cd9d6787be87dfaa98dc0a77"} Mar 21 03:46:50 crc kubenswrapper[4685]: I0321 03:46:50.541610 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:50 crc kubenswrapper[4685]: I0321 03:46:50.542534 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:50 crc kubenswrapper[4685]: I0321 03:46:50.542605 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:50 crc kubenswrapper[4685]: I0321 03:46:50.542622 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:46:51 crc kubenswrapper[4685]: I0321 03:46:51.216608 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 03:46:51 crc kubenswrapper[4685]: I0321 03:46:51.550443 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 21 03:46:51 crc kubenswrapper[4685]: I0321 03:46:51.551176 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 21 03:46:51 crc kubenswrapper[4685]: I0321 03:46:51.553470 4685 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a2a75c9df2a4e1b36d26acbbc5b2b7cb6e790962cd9d6787be87dfaa98dc0a77" exitCode=255 Mar 21 03:46:51 crc kubenswrapper[4685]: I0321 03:46:51.553530 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a2a75c9df2a4e1b36d26acbbc5b2b7cb6e790962cd9d6787be87dfaa98dc0a77"} Mar 21 03:46:51 crc kubenswrapper[4685]: I0321 03:46:51.553589 4685 scope.go:117] "RemoveContainer" containerID="1c24ed6bed5931f89f0ee754ab1515937df7ecbee9152bacfd444595e8360e9c" Mar 21 03:46:51 crc kubenswrapper[4685]: I0321 03:46:51.553790 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:46:51 crc kubenswrapper[4685]: I0321 03:46:51.555354 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:46:51 crc kubenswrapper[4685]: I0321 03:46:51.555391 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:46:51 crc kubenswrapper[4685]: I0321 03:46:51.555409 4685 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 03:46:51 crc kubenswrapper[4685]: I0321 03:46:51.556598 4685 scope.go:117] "RemoveContainer" containerID="a2a75c9df2a4e1b36d26acbbc5b2b7cb6e790962cd9d6787be87dfaa98dc0a77"
Mar 21 03:46:51 crc kubenswrapper[4685]: E0321 03:46:51.557021 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 21 03:46:52 crc kubenswrapper[4685]: I0321 03:46:52.215566 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 03:46:52 crc kubenswrapper[4685]: I0321 03:46:52.559359 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 21 03:46:53 crc kubenswrapper[4685]: I0321 03:46:53.112777 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 21 03:46:53 crc kubenswrapper[4685]: I0321 03:46:53.113332 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 03:46:53 crc kubenswrapper[4685]: I0321 03:46:53.114823 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 03:46:53 crc kubenswrapper[4685]: I0321 03:46:53.114991 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 03:46:53 crc kubenswrapper[4685]: I0321 03:46:53.115077 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 03:46:53 crc kubenswrapper[4685]: I0321 03:46:53.115962 4685 scope.go:117] "RemoveContainer" containerID="a2a75c9df2a4e1b36d26acbbc5b2b7cb6e790962cd9d6787be87dfaa98dc0a77"
Mar 21 03:46:53 crc kubenswrapper[4685]: E0321 03:46:53.116265 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 21 03:46:53 crc kubenswrapper[4685]: I0321 03:46:53.216686 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 03:46:53 crc kubenswrapper[4685]: W0321 03:46:53.323280 4685 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope
Mar 21 03:46:53 crc kubenswrapper[4685]: E0321 03:46:53.323360 4685 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 21 03:46:53 crc kubenswrapper[4685]: W0321 03:46:53.527606 4685 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Mar 21 03:46:53 crc kubenswrapper[4685]: E0321 03:46:53.527685 4685 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 21 03:46:54 crc kubenswrapper[4685]: I0321 03:46:54.214646 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 03:46:54 crc kubenswrapper[4685]: I0321 03:46:54.406186 4685 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 21 03:46:54 crc kubenswrapper[4685]: I0321 03:46:54.406542 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 03:46:54 crc kubenswrapper[4685]: I0321 03:46:54.408168 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 03:46:54 crc kubenswrapper[4685]: I0321 03:46:54.408290 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 03:46:54 crc kubenswrapper[4685]: I0321 03:46:54.408364 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 03:46:54 crc kubenswrapper[4685]: I0321 03:46:54.409046 4685 scope.go:117] "RemoveContainer" containerID="a2a75c9df2a4e1b36d26acbbc5b2b7cb6e790962cd9d6787be87dfaa98dc0a77"
Mar 21 03:46:54 crc kubenswrapper[4685]: E0321 03:46:54.409328 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 21 03:46:55 crc kubenswrapper[4685]: I0321 03:46:55.217819 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 03:46:56 crc kubenswrapper[4685]: W0321 03:46:56.082539 4685 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Mar 21 03:46:56 crc kubenswrapper[4685]: E0321 03:46:56.082624 4685 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 21 03:46:56 crc kubenswrapper[4685]: I0321 03:46:56.216473 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 03:46:57 crc kubenswrapper[4685]: I0321 03:46:57.219972 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 03:46:57 crc kubenswrapper[4685]: I0321 03:46:57.516586 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 03:46:57 crc kubenswrapper[4685]: I0321 03:46:57.518702 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 03:46:57 crc kubenswrapper[4685]: I0321 03:46:57.518755 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 03:46:57 crc kubenswrapper[4685]: I0321 03:46:57.518773 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 03:46:57 crc kubenswrapper[4685]: I0321 03:46:57.518814 4685 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 21 03:46:57 crc kubenswrapper[4685]: E0321 03:46:57.524798 4685 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 21 03:46:57 crc kubenswrapper[4685]: E0321 03:46:57.524914 4685 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 21 03:46:58 crc kubenswrapper[4685]: I0321 03:46:58.084511 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 21 03:46:58 crc kubenswrapper[4685]: I0321 03:46:58.084719 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 03:46:58 crc kubenswrapper[4685]: I0321 03:46:58.086679 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 03:46:58 crc kubenswrapper[4685]: I0321 03:46:58.086717 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 03:46:58 crc kubenswrapper[4685]: I0321 03:46:58.086729 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 03:46:58 crc kubenswrapper[4685]: I0321 03:46:58.217179 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 03:46:58 crc kubenswrapper[4685]: E0321 03:46:58.388906 4685 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 21 03:46:59 crc kubenswrapper[4685]: I0321 03:46:59.223467 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 03:46:59 crc kubenswrapper[4685]: I0321 03:46:59.925405 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 21 03:46:59 crc kubenswrapper[4685]: I0321 03:46:59.925634 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 03:46:59 crc kubenswrapper[4685]: I0321 03:46:59.927954 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 03:46:59 crc kubenswrapper[4685]: I0321 03:46:59.928004 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 03:46:59 crc kubenswrapper[4685]: I0321 03:46:59.928024 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 03:47:00 crc kubenswrapper[4685]: W0321 03:47:00.073112 4685 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Mar 21 03:47:00 crc kubenswrapper[4685]: E0321 03:47:00.073175 4685 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 21 03:47:00 crc kubenswrapper[4685]: I0321 03:47:00.218319 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 03:47:01 crc kubenswrapper[4685]: I0321 03:47:01.217919 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 03:47:02 crc kubenswrapper[4685]: I0321 03:47:02.217491 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 03:47:03 crc kubenswrapper[4685]: I0321 03:47:03.217586 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 03:47:04 crc kubenswrapper[4685]: I0321 03:47:04.217453 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 03:47:04 crc kubenswrapper[4685]: I0321 03:47:04.526018 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 03:47:04 crc kubenswrapper[4685]: I0321 03:47:04.527700 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 03:47:04 crc kubenswrapper[4685]: I0321 03:47:04.527787 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 03:47:04 crc kubenswrapper[4685]: I0321 03:47:04.527810 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 03:47:04 crc kubenswrapper[4685]: I0321 03:47:04.527885 4685 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 21 03:47:04 crc kubenswrapper[4685]: E0321 03:47:04.533660 4685 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 21 03:47:04 crc kubenswrapper[4685]: E0321 03:47:04.533777 4685 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 21 03:47:05 crc kubenswrapper[4685]: I0321 03:47:05.217715 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 03:47:06 crc kubenswrapper[4685]: I0321 03:47:06.217623 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 03:47:07 crc kubenswrapper[4685]: I0321 03:47:07.217713 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 03:47:07 crc kubenswrapper[4685]: I0321 03:47:07.300828 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 03:47:07 crc kubenswrapper[4685]: I0321 03:47:07.302783 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 03:47:07 crc kubenswrapper[4685]: I0321 03:47:07.302898 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 03:47:07 crc kubenswrapper[4685]: I0321 03:47:07.302921 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 03:47:07 crc kubenswrapper[4685]: I0321 03:47:07.304008 4685 scope.go:117] "RemoveContainer" containerID="a2a75c9df2a4e1b36d26acbbc5b2b7cb6e790962cd9d6787be87dfaa98dc0a77"
Mar 21 03:47:07 crc kubenswrapper[4685]: E0321 03:47:07.304371 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 21 03:47:08 crc kubenswrapper[4685]: I0321 03:47:08.218452 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 03:47:08 crc kubenswrapper[4685]: E0321 03:47:08.389986 4685 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 21 03:47:09 crc kubenswrapper[4685]: I0321 03:47:09.217777 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 03:47:10 crc kubenswrapper[4685]: I0321 03:47:10.218422 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 03:47:11 crc kubenswrapper[4685]: I0321 03:47:11.218269 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 03:47:11 crc kubenswrapper[4685]: I0321 03:47:11.534656 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 03:47:11 crc kubenswrapper[4685]: I0321 03:47:11.536728 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 03:47:11 crc kubenswrapper[4685]: I0321 03:47:11.536801 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 03:47:11 crc kubenswrapper[4685]: I0321 03:47:11.536819 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 03:47:11 crc kubenswrapper[4685]: I0321 03:47:11.536894 4685 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 21 03:47:11 crc kubenswrapper[4685]: E0321 03:47:11.542438 4685 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 21 03:47:11 crc kubenswrapper[4685]: E0321 03:47:11.542457 4685 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 21 03:47:12 crc kubenswrapper[4685]: I0321 03:47:12.216285 4685 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 03:47:13 crc kubenswrapper[4685]: I0321 03:47:13.052204 4685 csr.go:261] certificate signing request csr-gvf69 is approved, waiting to be issued
Mar 21 03:47:13 crc kubenswrapper[4685]: I0321 03:47:13.066162 4685 csr.go:257] certificate signing request csr-gvf69 is issued
Mar 21 03:47:13 crc kubenswrapper[4685]: I0321 03:47:13.141505 4685 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 21 03:47:14 crc kubenswrapper[4685]: I0321 03:47:14.042059 4685 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 21 03:47:14 crc kubenswrapper[4685]: I0321 03:47:14.068528 4685 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-30 03:37:06.05979734 +0000 UTC
Mar 21 03:47:14 crc kubenswrapper[4685]: I0321 03:47:14.068605 4685 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6815h49m51.991198879s for next certificate rotation
Mar 21 03:47:14 crc kubenswrapper[4685]: I0321 03:47:14.299962 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 03:47:14 crc kubenswrapper[4685]: I0321 03:47:14.301159 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 03:47:14 crc kubenswrapper[4685]: I0321 03:47:14.301202 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 03:47:14 crc kubenswrapper[4685]: I0321 03:47:14.301214 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.301037 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.303505 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.303585 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.303609 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.304715 4685 scope.go:117] "RemoveContainer" containerID="a2a75c9df2a4e1b36d26acbbc5b2b7cb6e790962cd9d6787be87dfaa98dc0a77"
Mar 21 03:47:18 crc kubenswrapper[4685]: E0321 03:47:18.390141 4685 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.543403 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.544828 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.544927 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.544942 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.545107 4685 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.556358 4685 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.556811 4685 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Mar 21 03:47:18 crc kubenswrapper[4685]: E0321 03:47:18.556867 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.559961 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.560007 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.560025 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.560048 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.560068 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:18Z","lastTransitionTime":"2026-03-21T03:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 21 03:47:18 crc kubenswrapper[4685]: E0321 03:47:18.573489 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.582923 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.582963 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.582984 4685 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.583002 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.583015 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:18Z","lastTransitionTime":"2026-03-21T03:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:18 crc kubenswrapper[4685]: E0321 03:47:18.591245 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.598276 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.598315 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.598328 4685 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.598344 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.598356 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:18Z","lastTransitionTime":"2026-03-21T03:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:18 crc kubenswrapper[4685]: E0321 03:47:18.606892 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.615517 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.615570 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.615590 4685 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.615614 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.615631 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:18Z","lastTransitionTime":"2026-03-21T03:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:18 crc kubenswrapper[4685]: E0321 03:47:18.628754 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:18 crc kubenswrapper[4685]: E0321 03:47:18.628883 4685 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 03:47:18 crc kubenswrapper[4685]: E0321 03:47:18.628916 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:18 crc kubenswrapper[4685]: E0321 03:47:18.729570 4685 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.806226 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.807767 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c896022c1e0726ccc5a54da9c432a0ff4082158ad76abcea60c0a1427c69134b"} Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.807976 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.809086 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.809159 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:18 crc kubenswrapper[4685]: I0321 03:47:18.809182 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:18 crc kubenswrapper[4685]: E0321 03:47:18.830655 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:18 crc kubenswrapper[4685]: E0321 03:47:18.930966 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:19 crc kubenswrapper[4685]: E0321 03:47:19.031397 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:19 crc kubenswrapper[4685]: E0321 03:47:19.131929 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:19 crc kubenswrapper[4685]: E0321 03:47:19.233129 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:19 crc kubenswrapper[4685]: E0321 03:47:19.333892 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:19 crc kubenswrapper[4685]: E0321 03:47:19.434029 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:19 crc kubenswrapper[4685]: E0321 03:47:19.534567 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:19 crc kubenswrapper[4685]: E0321 03:47:19.635303 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:19 crc kubenswrapper[4685]: E0321 03:47:19.736171 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:19 crc kubenswrapper[4685]: I0321 03:47:19.811286 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 21 03:47:19 crc kubenswrapper[4685]: I0321 03:47:19.811965 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 21 03:47:19 crc 
kubenswrapper[4685]: I0321 03:47:19.813365 4685 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c896022c1e0726ccc5a54da9c432a0ff4082158ad76abcea60c0a1427c69134b" exitCode=255 Mar 21 03:47:19 crc kubenswrapper[4685]: I0321 03:47:19.813416 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c896022c1e0726ccc5a54da9c432a0ff4082158ad76abcea60c0a1427c69134b"} Mar 21 03:47:19 crc kubenswrapper[4685]: I0321 03:47:19.813469 4685 scope.go:117] "RemoveContainer" containerID="a2a75c9df2a4e1b36d26acbbc5b2b7cb6e790962cd9d6787be87dfaa98dc0a77" Mar 21 03:47:19 crc kubenswrapper[4685]: I0321 03:47:19.813704 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:47:19 crc kubenswrapper[4685]: I0321 03:47:19.815160 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:19 crc kubenswrapper[4685]: I0321 03:47:19.815208 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:19 crc kubenswrapper[4685]: I0321 03:47:19.815225 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:19 crc kubenswrapper[4685]: I0321 03:47:19.816065 4685 scope.go:117] "RemoveContainer" containerID="c896022c1e0726ccc5a54da9c432a0ff4082158ad76abcea60c0a1427c69134b" Mar 21 03:47:19 crc kubenswrapper[4685]: E0321 03:47:19.816318 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 03:47:19 crc kubenswrapper[4685]: E0321 03:47:19.836239 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:19 crc kubenswrapper[4685]: I0321 03:47:19.840265 4685 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 21 03:47:19 crc kubenswrapper[4685]: E0321 03:47:19.937748 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:20 crc kubenswrapper[4685]: E0321 03:47:20.038777 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:20 crc kubenswrapper[4685]: E0321 03:47:20.139886 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:20 crc kubenswrapper[4685]: E0321 03:47:20.240444 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:20 crc kubenswrapper[4685]: E0321 03:47:20.340866 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:20 crc kubenswrapper[4685]: E0321 03:47:20.441744 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:20 crc kubenswrapper[4685]: E0321 03:47:20.542279 4685 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 21 03:47:20 crc kubenswrapper[4685]: E0321 03:47:20.643473 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:20 crc kubenswrapper[4685]: E0321 03:47:20.744304 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:20 crc kubenswrapper[4685]: I0321 03:47:20.819733 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 21 03:47:20 crc kubenswrapper[4685]: E0321 03:47:20.844431 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:20 crc kubenswrapper[4685]: E0321 03:47:20.944906 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:21 crc kubenswrapper[4685]: E0321 03:47:21.045055 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:21 crc kubenswrapper[4685]: E0321 03:47:21.146122 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:21 crc kubenswrapper[4685]: E0321 03:47:21.246824 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:21 crc kubenswrapper[4685]: E0321 03:47:21.347795 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:21 crc kubenswrapper[4685]: E0321 03:47:21.448300 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:21 crc kubenswrapper[4685]: E0321 03:47:21.548721 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:21 crc kubenswrapper[4685]: E0321 03:47:21.649857 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:21 crc kubenswrapper[4685]: E0321 03:47:21.750615 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:21 crc kubenswrapper[4685]: E0321 03:47:21.851368 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:21 crc kubenswrapper[4685]: E0321 03:47:21.952375 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:22 crc kubenswrapper[4685]: E0321 03:47:22.052797 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:22 crc kubenswrapper[4685]: E0321 03:47:22.153722 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:22 crc kubenswrapper[4685]: I0321 03:47:22.179145 4685 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 21 03:47:22 crc kubenswrapper[4685]: E0321 03:47:22.254477 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:22 crc kubenswrapper[4685]: E0321 03:47:22.355624 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:22 crc kubenswrapper[4685]: E0321 
03:47:22.456431 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:22 crc kubenswrapper[4685]: E0321 03:47:22.556877 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:22 crc kubenswrapper[4685]: E0321 03:47:22.657533 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:22 crc kubenswrapper[4685]: E0321 03:47:22.757792 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:22 crc kubenswrapper[4685]: E0321 03:47:22.858345 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:22 crc kubenswrapper[4685]: E0321 03:47:22.959449 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:23 crc kubenswrapper[4685]: E0321 03:47:23.060076 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:23 crc kubenswrapper[4685]: I0321 03:47:23.113000 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:47:23 crc kubenswrapper[4685]: I0321 03:47:23.113337 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:47:23 crc kubenswrapper[4685]: I0321 03:47:23.115935 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:23 crc kubenswrapper[4685]: I0321 03:47:23.116012 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:23 crc kubenswrapper[4685]: I0321 03:47:23.116039 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:23 crc kubenswrapper[4685]: I0321 03:47:23.117382 4685 scope.go:117] "RemoveContainer" containerID="c896022c1e0726ccc5a54da9c432a0ff4082158ad76abcea60c0a1427c69134b" Mar 21 03:47:23 crc kubenswrapper[4685]: E0321 03:47:23.117956 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 03:47:23 crc kubenswrapper[4685]: E0321 03:47:23.160565 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:23 crc kubenswrapper[4685]: E0321 03:47:23.261077 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:23 crc kubenswrapper[4685]: E0321 03:47:23.361572 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:23 crc kubenswrapper[4685]: E0321 03:47:23.462515 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:23 crc kubenswrapper[4685]: E0321 03:47:23.562698 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:23 crc kubenswrapper[4685]: E0321 
03:47:23.662965 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:23 crc kubenswrapper[4685]: E0321 03:47:23.763739 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:23 crc kubenswrapper[4685]: E0321 03:47:23.864200 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:23 crc kubenswrapper[4685]: E0321 03:47:23.964712 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:24 crc kubenswrapper[4685]: E0321 03:47:24.065098 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:24 crc kubenswrapper[4685]: E0321 03:47:24.165399 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:24 crc kubenswrapper[4685]: E0321 03:47:24.265884 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:24 crc kubenswrapper[4685]: E0321 03:47:24.366097 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:24 crc kubenswrapper[4685]: I0321 03:47:24.405739 4685 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:47:24 crc kubenswrapper[4685]: I0321 03:47:24.406029 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:47:24 crc kubenswrapper[4685]: I0321 03:47:24.407910 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:24 crc kubenswrapper[4685]: I0321 03:47:24.407995 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:24 crc kubenswrapper[4685]: I0321 03:47:24.408022 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:24 crc kubenswrapper[4685]: I0321 03:47:24.409431 4685 scope.go:117] "RemoveContainer" containerID="c896022c1e0726ccc5a54da9c432a0ff4082158ad76abcea60c0a1427c69134b" Mar 21 03:47:24 crc kubenswrapper[4685]: E0321 03:47:24.409900 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 03:47:24 crc kubenswrapper[4685]: E0321 03:47:24.466300 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:24 crc kubenswrapper[4685]: E0321 03:47:24.566689 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:24 crc kubenswrapper[4685]: E0321 03:47:24.668342 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:24 crc kubenswrapper[4685]: E0321 03:47:24.769955 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:24 crc kubenswrapper[4685]: 
E0321 03:47:24.870508 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:24 crc kubenswrapper[4685]: E0321 03:47:24.971633 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:25 crc kubenswrapper[4685]: E0321 03:47:25.072504 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:25 crc kubenswrapper[4685]: E0321 03:47:25.173431 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:25 crc kubenswrapper[4685]: E0321 03:47:25.274685 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:25 crc kubenswrapper[4685]: E0321 03:47:25.375661 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:25 crc kubenswrapper[4685]: E0321 03:47:25.476171 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:25 crc kubenswrapper[4685]: E0321 03:47:25.576630 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:25 crc kubenswrapper[4685]: E0321 03:47:25.677479 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:25 crc kubenswrapper[4685]: E0321 03:47:25.778583 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:25 crc kubenswrapper[4685]: E0321 03:47:25.879345 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:25 crc kubenswrapper[4685]: E0321 03:47:25.980029 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:26 crc kubenswrapper[4685]: E0321 03:47:26.080726 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:26 crc kubenswrapper[4685]: E0321 03:47:26.180913 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:26 crc kubenswrapper[4685]: E0321 03:47:26.282040 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:26 crc kubenswrapper[4685]: E0321 03:47:26.382969 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:26 crc kubenswrapper[4685]: E0321 03:47:26.484446 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:26 crc kubenswrapper[4685]: E0321 03:47:26.584956 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:26 crc kubenswrapper[4685]: E0321 03:47:26.685797 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:26 crc kubenswrapper[4685]: E0321 03:47:26.786777 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:26 crc kubenswrapper[4685]: E0321 03:47:26.887940 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:26 crc 
kubenswrapper[4685]: E0321 03:47:26.988100 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:27 crc kubenswrapper[4685]: E0321 03:47:27.088993 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:27 crc kubenswrapper[4685]: E0321 03:47:27.189917 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:27 crc kubenswrapper[4685]: E0321 03:47:27.290614 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:27 crc kubenswrapper[4685]: E0321 03:47:27.391442 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:27 crc kubenswrapper[4685]: E0321 03:47:27.491810 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:27 crc kubenswrapper[4685]: E0321 03:47:27.591979 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:27 crc kubenswrapper[4685]: E0321 03:47:27.693137 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:27 crc kubenswrapper[4685]: E0321 03:47:27.793487 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:27 crc kubenswrapper[4685]: E0321 03:47:27.894563 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:27 crc kubenswrapper[4685]: E0321 03:47:27.994712 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:28 crc kubenswrapper[4685]: E0321 03:47:28.095238 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:28 crc kubenswrapper[4685]: E0321 03:47:28.196056 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:28 crc kubenswrapper[4685]: E0321 03:47:28.297144 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:28 crc kubenswrapper[4685]: I0321 03:47:28.300682 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:47:28 crc kubenswrapper[4685]: I0321 03:47:28.302417 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:28 crc kubenswrapper[4685]: I0321 03:47:28.302454 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:28 crc kubenswrapper[4685]: I0321 03:47:28.302467 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:28 crc kubenswrapper[4685]: E0321 03:47:28.390308 4685 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 03:47:28 crc kubenswrapper[4685]: E0321 03:47:28.397378 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:28 crc kubenswrapper[4685]: E0321 03:47:28.498227 4685 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 21 03:47:28 crc kubenswrapper[4685]: E0321 03:47:28.599283 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:28 crc kubenswrapper[4685]: E0321 03:47:28.699516 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:28 crc kubenswrapper[4685]: E0321 03:47:28.800268 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:28 crc kubenswrapper[4685]: E0321 03:47:28.901005 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:29 crc kubenswrapper[4685]: E0321 03:47:29.002126 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:29 crc kubenswrapper[4685]: E0321 03:47:29.025606 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 21 03:47:29 crc kubenswrapper[4685]: I0321 03:47:29.031651 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:29 crc kubenswrapper[4685]: I0321 03:47:29.031711 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:29 crc kubenswrapper[4685]: I0321 03:47:29.031730 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:29 crc kubenswrapper[4685]: I0321 03:47:29.031755 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:29 crc kubenswrapper[4685]: I0321 03:47:29.031775 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:29Z","lastTransitionTime":"2026-03-21T03:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:29 crc kubenswrapper[4685]: E0321 03:47:29.049818 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:29 crc kubenswrapper[4685]: I0321 03:47:29.054588 4685 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:29 crc kubenswrapper[4685]: I0321 03:47:29.054644 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:29 crc kubenswrapper[4685]: I0321 03:47:29.054657 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:29 crc kubenswrapper[4685]: I0321 03:47:29.054679 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:29 crc kubenswrapper[4685]: I0321 03:47:29.054693 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:29Z","lastTransitionTime":"2026-03-21T03:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:29 crc kubenswrapper[4685]: E0321 03:47:29.072244 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:29 crc kubenswrapper[4685]: I0321 03:47:29.078032 4685 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:29 crc kubenswrapper[4685]: I0321 03:47:29.078104 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:29 crc kubenswrapper[4685]: I0321 03:47:29.078121 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:29 crc kubenswrapper[4685]: I0321 03:47:29.078150 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:29 crc kubenswrapper[4685]: I0321 03:47:29.078170 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:29Z","lastTransitionTime":"2026-03-21T03:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:29 crc kubenswrapper[4685]: E0321 03:47:29.093768 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:29 crc kubenswrapper[4685]: I0321 03:47:29.098605 4685 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:29 crc kubenswrapper[4685]: I0321 03:47:29.098678 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:29 crc kubenswrapper[4685]: I0321 03:47:29.098696 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:29 crc kubenswrapper[4685]: I0321 03:47:29.098723 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:29 crc kubenswrapper[4685]: I0321 03:47:29.098742 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:29Z","lastTransitionTime":"2026-03-21T03:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:29 crc kubenswrapper[4685]: E0321 03:47:29.112505 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:29 crc kubenswrapper[4685]: E0321 03:47:29.112638 4685 
kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 03:47:29 crc kubenswrapper[4685]: E0321 03:47:29.112686 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:29 crc kubenswrapper[4685]: E0321 03:47:29.213737 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:29 crc kubenswrapper[4685]: E0321 03:47:29.314230 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:29 crc kubenswrapper[4685]: E0321 03:47:29.415308 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:29 crc kubenswrapper[4685]: E0321 03:47:29.515687 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:29 crc kubenswrapper[4685]: E0321 03:47:29.615889 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:29 crc kubenswrapper[4685]: E0321 03:47:29.716745 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:29 crc kubenswrapper[4685]: E0321 03:47:29.817510 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:29 crc kubenswrapper[4685]: E0321 03:47:29.918421 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:30 crc kubenswrapper[4685]: E0321 03:47:30.019099 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:30 crc kubenswrapper[4685]: E0321 03:47:30.120035 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:30 crc kubenswrapper[4685]: E0321 03:47:30.220639 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:30 crc kubenswrapper[4685]: E0321 03:47:30.321724 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:30 crc kubenswrapper[4685]: E0321 03:47:30.422034 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:30 crc kubenswrapper[4685]: E0321 03:47:30.523191 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:30 crc kubenswrapper[4685]: E0321 03:47:30.624009 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:30 crc kubenswrapper[4685]: E0321 03:47:30.724789 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:30 crc kubenswrapper[4685]: E0321 03:47:30.825381 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:30 crc kubenswrapper[4685]: E0321 03:47:30.925682 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:31 crc kubenswrapper[4685]: E0321 03:47:31.026682 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:31 crc kubenswrapper[4685]: E0321 
03:47:31.127727 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:31 crc kubenswrapper[4685]: E0321 03:47:31.228325 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:31 crc kubenswrapper[4685]: E0321 03:47:31.329031 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:31 crc kubenswrapper[4685]: E0321 03:47:31.430304 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:31 crc kubenswrapper[4685]: E0321 03:47:31.531268 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:31 crc kubenswrapper[4685]: E0321 03:47:31.631698 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:31 crc kubenswrapper[4685]: E0321 03:47:31.732315 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:31 crc kubenswrapper[4685]: E0321 03:47:31.832647 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:31 crc kubenswrapper[4685]: E0321 03:47:31.933034 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:32 crc kubenswrapper[4685]: E0321 03:47:32.034025 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:32 crc kubenswrapper[4685]: E0321 03:47:32.134561 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:32 crc kubenswrapper[4685]: E0321 03:47:32.235196 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:32 crc kubenswrapper[4685]: E0321 03:47:32.336108 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:32 crc kubenswrapper[4685]: E0321 03:47:32.436395 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:32 crc kubenswrapper[4685]: E0321 03:47:32.537353 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:32 crc kubenswrapper[4685]: E0321 03:47:32.638138 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:32 crc kubenswrapper[4685]: E0321 03:47:32.738666 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:32 crc kubenswrapper[4685]: E0321 03:47:32.839790 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:32 crc kubenswrapper[4685]: E0321 03:47:32.940633 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:33 crc kubenswrapper[4685]: E0321 03:47:33.041536 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:33 crc kubenswrapper[4685]: E0321 03:47:33.142676 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:33 crc 
kubenswrapper[4685]: E0321 03:47:33.243002 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:33 crc kubenswrapper[4685]: E0321 03:47:33.343645 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:33 crc kubenswrapper[4685]: E0321 03:47:33.444120 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:33 crc kubenswrapper[4685]: E0321 03:47:33.545339 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:33 crc kubenswrapper[4685]: E0321 03:47:33.645549 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:33 crc kubenswrapper[4685]: E0321 03:47:33.746052 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:33 crc kubenswrapper[4685]: E0321 03:47:33.846804 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:33 crc kubenswrapper[4685]: E0321 03:47:33.947865 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:34 crc kubenswrapper[4685]: E0321 03:47:34.048568 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:34 crc kubenswrapper[4685]: E0321 03:47:34.149219 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:34 crc kubenswrapper[4685]: E0321 03:47:34.249941 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:34 crc kubenswrapper[4685]: E0321 03:47:34.350163 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:34 crc kubenswrapper[4685]: E0321 03:47:34.450306 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:34 crc kubenswrapper[4685]: E0321 03:47:34.550449 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:34 crc kubenswrapper[4685]: E0321 03:47:34.651173 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:34 crc kubenswrapper[4685]: E0321 03:47:34.751621 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:34 crc kubenswrapper[4685]: E0321 03:47:34.852653 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:34 crc kubenswrapper[4685]: E0321 03:47:34.953635 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:35 crc kubenswrapper[4685]: E0321 03:47:35.054787 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:35 crc kubenswrapper[4685]: E0321 03:47:35.155638 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:35 crc kubenswrapper[4685]: E0321 03:47:35.256486 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 
21 03:47:35 crc kubenswrapper[4685]: E0321 03:47:35.357504 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:35 crc kubenswrapper[4685]: E0321 03:47:35.458356 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:35 crc kubenswrapper[4685]: E0321 03:47:35.558714 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:35 crc kubenswrapper[4685]: E0321 03:47:35.658955 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:35 crc kubenswrapper[4685]: E0321 03:47:35.759448 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:35 crc kubenswrapper[4685]: I0321 03:47:35.817645 4685 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 21 03:47:35 crc kubenswrapper[4685]: E0321 03:47:35.859611 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:35 crc kubenswrapper[4685]: E0321 03:47:35.959729 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:36 crc kubenswrapper[4685]: E0321 03:47:36.060437 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:36 crc kubenswrapper[4685]: E0321 03:47:36.161089 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:36 crc kubenswrapper[4685]: E0321 03:47:36.261873 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:36 crc kubenswrapper[4685]: E0321 03:47:36.362262 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:36 crc kubenswrapper[4685]: E0321 03:47:36.462588 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:36 crc kubenswrapper[4685]: E0321 03:47:36.563521 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:36 crc kubenswrapper[4685]: E0321 03:47:36.664368 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:36 crc kubenswrapper[4685]: E0321 03:47:36.764696 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:36 crc kubenswrapper[4685]: E0321 03:47:36.864823 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:36 crc kubenswrapper[4685]: E0321 03:47:36.965800 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:37 crc kubenswrapper[4685]: E0321 03:47:37.066920 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:37 crc kubenswrapper[4685]: E0321 03:47:37.168398 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:37 crc kubenswrapper[4685]: E0321 03:47:37.269327 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not 
found" Mar 21 03:47:37 crc kubenswrapper[4685]: I0321 03:47:37.300450 4685 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 03:47:37 crc kubenswrapper[4685]: I0321 03:47:37.302006 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:37 crc kubenswrapper[4685]: I0321 03:47:37.302160 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:37 crc kubenswrapper[4685]: I0321 03:47:37.302228 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:37 crc kubenswrapper[4685]: I0321 03:47:37.303014 4685 scope.go:117] "RemoveContainer" containerID="c896022c1e0726ccc5a54da9c432a0ff4082158ad76abcea60c0a1427c69134b" Mar 21 03:47:37 crc kubenswrapper[4685]: E0321 03:47:37.303309 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 03:47:37 crc kubenswrapper[4685]: E0321 03:47:37.370199 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:37 crc kubenswrapper[4685]: E0321 03:47:37.470756 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:37 crc kubenswrapper[4685]: E0321 03:47:37.572063 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:37 crc kubenswrapper[4685]: E0321 03:47:37.673175 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:37 crc kubenswrapper[4685]: E0321 03:47:37.774387 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:37 crc kubenswrapper[4685]: E0321 03:47:37.875223 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:37 crc kubenswrapper[4685]: E0321 03:47:37.976677 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:38 crc kubenswrapper[4685]: E0321 03:47:38.076956 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:38 crc kubenswrapper[4685]: E0321 03:47:38.177610 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:38 crc kubenswrapper[4685]: E0321 03:47:38.277896 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:38 crc kubenswrapper[4685]: E0321 03:47:38.378983 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:38 crc kubenswrapper[4685]: E0321 03:47:38.391276 4685 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 03:47:38 crc kubenswrapper[4685]: E0321 03:47:38.479416 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"crc\" not found" Mar 21 03:47:38 crc kubenswrapper[4685]: E0321 03:47:38.580353 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:38 crc kubenswrapper[4685]: E0321 03:47:38.681438 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:38 crc kubenswrapper[4685]: E0321 03:47:38.781676 4685 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 03:47:38 crc kubenswrapper[4685]: I0321 03:47:38.799342 4685 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 21 03:47:38 crc kubenswrapper[4685]: I0321 03:47:38.884330 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:38 crc kubenswrapper[4685]: I0321 03:47:38.884397 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:38 crc kubenswrapper[4685]: I0321 03:47:38.884414 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:38 crc kubenswrapper[4685]: I0321 03:47:38.884444 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:38 crc kubenswrapper[4685]: I0321 03:47:38.884465 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:38Z","lastTransitionTime":"2026-03-21T03:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:38 crc kubenswrapper[4685]: I0321 03:47:38.987472 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:38 crc kubenswrapper[4685]: I0321 03:47:38.987550 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:38 crc kubenswrapper[4685]: I0321 03:47:38.987573 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:38 crc kubenswrapper[4685]: I0321 03:47:38.987607 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:38 crc kubenswrapper[4685]: I0321 03:47:38.987631 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:38Z","lastTransitionTime":"2026-03-21T03:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.091294 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.091367 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.091392 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.091428 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.091452 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:39Z","lastTransitionTime":"2026-03-21T03:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.194877 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.195276 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.195374 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.195479 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.195569 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:39Z","lastTransitionTime":"2026-03-21T03:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.249846 4685 apiserver.go:52] "Watching apiserver" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.262972 4685 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.263709 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-machine-config-operator/machine-config-daemon-7r9cg","openshift-multus/multus-7jcm2","openshift-multus/network-metrics-daemon-v9rdl","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c","openshift-dns/node-resolver-mlsb2","openshift-multus/multus-additional-cni-plugins-6xvsf","openshift-ovn-kubernetes/ovnkube-node-cpfzk","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-image-registry/node-ca-ztl6v","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.264231 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.264403 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:47:39 crc kubenswrapper[4685]: E0321 03:47:39.264742 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.265134 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:47:39 crc kubenswrapper[4685]: E0321 03:47:39.265378 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.268111 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.268229 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.268405 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.269183 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.269285 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 03:47:39 crc kubenswrapper[4685]: E0321 03:47:39.269399 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.270539 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.270616 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-mlsb2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.270662 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.270937 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.270989 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.271119 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:47:39 crc kubenswrapper[4685]: E0321 03:47:39.271409 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.272001 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.272225 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ztl6v" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.272452 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.274322 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.274998 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.275395 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.275816 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.276511 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.276956 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.277892 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.278044 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.278073 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.278314 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.278355 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.278601 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.278762 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.278799 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.279062 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.279254 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.279296 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.279392 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.279621 4685 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.279674 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.279694 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.280080 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.280179 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.280193 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.280195 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.280241 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.280326 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.280381 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.280408 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.280422 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.280480 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.280938 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.286332 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.287026 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.297590 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.298553 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.298622 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.298650 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.298684 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.298708 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:39Z","lastTransitionTime":"2026-03-21T03:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.316729 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.324238 4685 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.335114 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.355874 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v9rdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.374041 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cpfzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.386419 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mlsb2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b79f01f-bf05-4f7d-b816-6ef01f21e949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tf6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mlsb2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.401154 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.401216 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.401229 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.401273 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.401294 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:39Z","lastTransitionTime":"2026-03-21T03:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.401971 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.409228 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.409315 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.409376 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.409434 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.409494 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.409552 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.409618 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.409680 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.409738 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.409798 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.409887 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.409944 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.410000 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.410050 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.410141 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.410202 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.410260 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.410325 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.410382 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.410544 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.410614 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.410669 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.410720 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.410780 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.410886 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.410949 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.411012 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.411064 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.411125 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.411188 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.411242 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.411302 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.411361 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.411421 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.411487 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.411548 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.411605 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.411662 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.411715 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.411779 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.411874 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.411928 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.411985 4685 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.412117 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.412177 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.412251 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.412316 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.412380 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.412442 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413228 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413261 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413288 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 
03:47:39.413309 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413329 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413353 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413376 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413458 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413477 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413494 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413510 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413528 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413544 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: 
\"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413562 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413579 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413599 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413619 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413643 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413660 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413682 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413700 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413718 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413733 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 21 03:47:39 crc 
kubenswrapper[4685]: I0321 03:47:39.413751 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413769 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413787 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413809 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413826 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413862 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413882 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413900 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413920 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414023 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414044 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414061 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414080 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414097 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414140 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414159 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414181 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414198 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414220 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414237 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414252 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414282 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414300 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414318 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414337 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414354 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414372 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.412756 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414389 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414407 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414424 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414441 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414458 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414475 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414491 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414510 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414527 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414544 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414561 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414558 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414564 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414701 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413133 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.415064 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.415144 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.415335 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.415453 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.415896 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.416066 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.416111 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.416201 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.416229 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.416241 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.416540 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.416506 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztl6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sc2qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztl6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.416910 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.416921 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.417024 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413268 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413365 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413360 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.417113 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413657 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413665 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413777 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413810 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.417190 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.417452 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414581 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.417507 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.417539 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.417560 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.417579 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.417599 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.417617 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.417634 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.417652 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.417669 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.417688 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.417706 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.417726 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.417743 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.417759 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.417776 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.417793 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.417816 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.417833 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.417873 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.417891 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.417908 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.417926 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.417943 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.417962 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.417994 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418010 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418027 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418044 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418060 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418077 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418093 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418141 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418159 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418178 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418197 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418213 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418236 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418253 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418270 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418287 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418308 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418327 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418344 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418360 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418378 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418394 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418413 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418430 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418447 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418471 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418487 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418503 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418521 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 
03:47:39.418539 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418556 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418574 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418596 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418614 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418632 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418651 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418671 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418689 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418710 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" 
(UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418730 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418749 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418768 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418788 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418821 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418857 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418876 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418896 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418914 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418937 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418956 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418978 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418998 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419019 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419039 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419060 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419078 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419095 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419112 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419131 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419150 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419215 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-systemd-units\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419237 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-run-ovn-kubernetes\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419256 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-host-var-lib-cni-bin\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419274 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-hostroot\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419292 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-host-run-multus-certs\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419312 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngpjd\" (UniqueName: \"kubernetes.io/projected/a5cb18bc-bda7-463e-98fe-6d8ff293b949-kube-api-access-ngpjd\") pod \"multus-additional-cni-plugins-6xvsf\" (UID: \"a5cb18bc-bda7-463e-98fe-6d8ff293b949\") " pod="openshift-multus/multus-additional-cni-plugins-6xvsf" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419328 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-cni-netd\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419351 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419368 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-host-run-k8s-cni-cncf-io\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419388 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fda9b1ff-e4a8-4d15-8f7b-2974991cd252-metrics-certs\") pod \"network-metrics-daemon-v9rdl\" (UID: \"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\") " pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419406 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/cea46fe2-4e41-43ab-a069-cb30fb4e732c-rootfs\") pod \"machine-config-daemon-7r9cg\" (UID: \"cea46fe2-4e41-43ab-a069-cb30fb4e732c\") " pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419422 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cea46fe2-4e41-43ab-a069-cb30fb4e732c-proxy-tls\") pod \"machine-config-daemon-7r9cg\" (UID: \"cea46fe2-4e41-43ab-a069-cb30fb4e732c\") " pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419440 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-run-ovn\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419456 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/08dfc393-0ddb-4bde-9b1f-2a48549f4549-ovn-node-metrics-cert\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419472 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-system-cni-dir\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419493 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419515 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5ntg\" (UniqueName: \"kubernetes.io/projected/cea46fe2-4e41-43ab-a069-cb30fb4e732c-kube-api-access-j5ntg\") pod \"machine-config-daemon-7r9cg\" (UID: \"cea46fe2-4e41-43ab-a069-cb30fb4e732c\") " pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419535 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419556 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/08dfc393-0ddb-4bde-9b1f-2a48549f4549-ovnkube-script-lib\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419572 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-multus-cni-dir\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419592 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2521a678-ad6c-464b-bf7b-c4f6237c2822-env-overrides\") pod \"ovnkube-control-plane-749d76644c-jrr5c\" (UID: \"2521a678-ad6c-464b-bf7b-c4f6237c2822\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419609 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-node-log\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419626 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x266z\" (UniqueName: \"kubernetes.io/projected/cd9b1743-6b69-46d3-a429-6f83bf43317a-kube-api-access-x266z\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419645 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m8g7\" (UniqueName: \"kubernetes.io/projected/2521a678-ad6c-464b-bf7b-c4f6237c2822-kube-api-access-7m8g7\") pod \"ovnkube-control-plane-749d76644c-jrr5c\" (UID: \"2521a678-ad6c-464b-bf7b-c4f6237c2822\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419664 4685 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-etc-openvswitch\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419680 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-cnibin\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419695 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-multus-socket-dir-parent\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419711 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-host-run-netns\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419726 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-host-var-lib-cni-multus\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419741 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2521a678-ad6c-464b-bf7b-c4f6237c2822-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jrr5c\" (UID: \"2521a678-ad6c-464b-bf7b-c4f6237c2822\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419758 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419774 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-run-systemd\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419794 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419812 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-host-var-lib-kubelet\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419830 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a5cb18bc-bda7-463e-98fe-6d8ff293b949-os-release\") pod \"multus-additional-cni-plugins-6xvsf\" (UID: \"a5cb18bc-bda7-463e-98fe-6d8ff293b949\") " pod="openshift-multus/multus-additional-cni-plugins-6xvsf" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419861 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a5cb18bc-bda7-463e-98fe-6d8ff293b949-cnibin\") pod \"multus-additional-cni-plugins-6xvsf\" (UID: \"a5cb18bc-bda7-463e-98fe-6d8ff293b949\") " pod="openshift-multus/multus-additional-cni-plugins-6xvsf" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419878 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a5cb18bc-bda7-463e-98fe-6d8ff293b949-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6xvsf\" (UID: \"a5cb18bc-bda7-463e-98fe-6d8ff293b949\") " pod="openshift-multus/multus-additional-cni-plugins-6xvsf" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419898 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419919 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-slash\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419940 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdrfx\" (UniqueName: \"kubernetes.io/projected/08dfc393-0ddb-4bde-9b1f-2a48549f4549-kube-api-access-fdrfx\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419961 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tf6h\" (UniqueName: \"kubernetes.io/projected/7b79f01f-bf05-4f7d-b816-6ef01f21e949-kube-api-access-6tf6h\") pod \"node-resolver-mlsb2\" (UID: \"7b79f01f-bf05-4f7d-b816-6ef01f21e949\") " pod="openshift-dns/node-resolver-mlsb2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419980 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cd9b1743-6b69-46d3-a429-6f83bf43317a-cni-binary-copy\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420001 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc2qv\" (UniqueName: \"kubernetes.io/projected/4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6-kube-api-access-sc2qv\") pod \"node-ca-ztl6v\" (UID: \"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\") " pod="openshift-image-registry/node-ca-ztl6v" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420021 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420044 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420068 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420085 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6-serviceca\") pod \"node-ca-ztl6v\" (UID: \"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\") " pod="openshift-image-registry/node-ca-ztl6v" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420103 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/08dfc393-0ddb-4bde-9b1f-2a48549f4549-env-overrides\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420123 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7b79f01f-bf05-4f7d-b816-6ef01f21e949-hosts-file\") pod \"node-resolver-mlsb2\" (UID: \"7b79f01f-bf05-4f7d-b816-6ef01f21e949\") " pod="openshift-dns/node-resolver-mlsb2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420140 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-os-release\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420158 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host\" (UniqueName: \"kubernetes.io/host-path/4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6-host\") pod \"node-ca-ztl6v\" (UID: \"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\") " pod="openshift-image-registry/node-ca-ztl6v" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420174 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-var-lib-openvswitch\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420192 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a5cb18bc-bda7-463e-98fe-6d8ff293b949-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6xvsf\" (UID: \"a5cb18bc-bda7-463e-98fe-6d8ff293b949\") " pod="openshift-multus/multus-additional-cni-plugins-6xvsf" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420208 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-kubelet\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420224 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-run-netns\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420238 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-log-socket\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420258 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420275 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrczq\" (UniqueName: \"kubernetes.io/projected/fda9b1ff-e4a8-4d15-8f7b-2974991cd252-kube-api-access-wrczq\") pod \"network-metrics-daemon-v9rdl\" (UID: \"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\") " pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420296 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:47:39 
crc kubenswrapper[4685]: I0321 03:47:39.420312 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-cni-bin\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420332 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420353 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/08dfc393-0ddb-4bde-9b1f-2a48549f4549-ovnkube-config\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420372 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-etc-kubernetes\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420389 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2521a678-ad6c-464b-bf7b-c4f6237c2822-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jrr5c\" (UID: \"2521a678-ad6c-464b-bf7b-c4f6237c2822\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420409 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cea46fe2-4e41-43ab-a069-cb30fb4e732c-mcd-auth-proxy-config\") pod \"machine-config-daemon-7r9cg\" (UID: \"cea46fe2-4e41-43ab-a069-cb30fb4e732c\") " pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420431 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420448 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-run-openvswitch\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420466 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420486 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cd9b1743-6b69-46d3-a429-6f83bf43317a-multus-daemon-config\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420504 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a5cb18bc-bda7-463e-98fe-6d8ff293b949-cni-binary-copy\") pod \"multus-additional-cni-plugins-6xvsf\" (UID: \"a5cb18bc-bda7-463e-98fe-6d8ff293b949\") " pod="openshift-multus/multus-additional-cni-plugins-6xvsf" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420521 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-multus-conf-dir\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420542 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420560 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a5cb18bc-bda7-463e-98fe-6d8ff293b949-system-cni-dir\") pod \"multus-additional-cni-plugins-6xvsf\" (UID: \"a5cb18bc-bda7-463e-98fe-6d8ff293b949\") " pod="openshift-multus/multus-additional-cni-plugins-6xvsf" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420630 4685 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420643 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420655 4685 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420666 4685 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420677 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: 
\"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420688 4685 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420698 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420709 4685 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420719 4685 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420730 4685 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420740 4685 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420752 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420764 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420775 4685 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420785 4685 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420796 4685 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420806 4685 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420816 4685 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420827 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420849 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420859 4685 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420871 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420881 4685 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420890 4685 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420901 4685 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420911 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420921 4685 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413910 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.416248 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413910 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414081 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414253 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414323 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414363 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.414576 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.417059 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.417423 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.417483 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.417778 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418566 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418634 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418402 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418864 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.418899 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419881 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.419904 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420372 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.420729 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: E0321 03:47:39.420986 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:47:39.920965943 +0000 UTC m=+92.398034735 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.421482 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.421506 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.421902 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.421901 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.422210 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.422369 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.422691 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.422876 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.423034 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.423371 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). 
InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.423428 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.423767 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.423883 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.424401 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.424464 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.424508 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.424551 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.425006 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.424003 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.425220 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.425245 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.425369 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.425509 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.425570 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.425577 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.425680 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.426277 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.426586 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.426909 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.427028 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.427057 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.427219 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.427270 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.427416 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.427436 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.427588 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.427702 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.427942 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.428007 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413126 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.428478 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.428620 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.428897 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.428983 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.429034 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.429333 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.429443 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.429603 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). 
InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.429307 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.429815 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: E0321 03:47:39.429870 4685 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.429795 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.430201 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.430430 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.430441 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.430514 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.430554 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.430670 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.430857 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.430990 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.430969 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.431038 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.431086 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.431132 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.431322 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.431669 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.432051 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.432239 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.432273 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.413215 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.432046 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: E0321 03:47:39.432422 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-21 03:47:39.932411095 +0000 UTC m=+92.409479887 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.432533 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.432728 4685 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.432776 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.433731 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.434098 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.434214 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.434815 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.435744 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.435759 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.434825 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.435580 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.435590 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.435676 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.435724 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.435896 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.435905 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.436591 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.437663 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.437919 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:39 crc kubenswrapper[4685]: E0321 03:47:39.442307 4685 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 03:47:39 crc kubenswrapper[4685]: E0321 03:47:39.442409 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 03:47:39.942383204 +0000 UTC m=+92.419452006 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 03:47:39 crc kubenswrapper[4685]: E0321 03:47:39.445727 4685 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 03:47:39 crc kubenswrapper[4685]: E0321 03:47:39.445760 4685 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 03:47:39 crc kubenswrapper[4685]: E0321 03:47:39.445781 4685 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 03:47:39 crc kubenswrapper[4685]: E0321 03:47:39.445888 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 03:47:39.945867225 +0000 UTC m=+92.422936027 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 03:47:39 crc kubenswrapper[4685]: E0321 03:47:39.447146 4685 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 03:47:39 crc kubenswrapper[4685]: E0321 03:47:39.447185 4685 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 03:47:39 crc kubenswrapper[4685]: E0321 03:47:39.447202 4685 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 03:47:39 crc kubenswrapper[4685]: E0321 03:47:39.447290 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 03:47:39.947263516 +0000 UTC m=+92.424332318 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.448738 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.450953 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2521a678-ad6c-464b-bf7b-c4f6237c2822\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jrr5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.457712 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.457787 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.458555 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.460130 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.460509 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.460770 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.460826 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.461038 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.461406 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.461314 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.461565 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.461645 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.463013 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea46fe2-4e41-43ab-a069-cb30fb4e732c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7r9cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.464002 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.464118 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.464312 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.464489 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.465156 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.465694 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.465867 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.466060 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.466492 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.466951 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.466939 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.470193 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.470228 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.473054 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.473093 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.473471 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.473532 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.473986 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.474675 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.475284 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.475734 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.475979 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.476228 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.476233 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.476282 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.476294 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.476367 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.476685 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.476723 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.476740 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.476906 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.476968 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.477014 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.477180 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.477209 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.477491 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.477614 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.477903 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.477963 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.478052 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.478176 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.478313 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.478408 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.478641 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.478745 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.478904 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.481133 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.482073 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.483006 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.490965 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6xvsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.500677 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.503242 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7jcm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9b1743-6b69-46d3-a429-6f83bf43317a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x266z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7jcm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.504734 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.505081 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.505134 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.505148 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.505154 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.505172 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.505217 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:39Z","lastTransitionTime":"2026-03-21T03:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.510207 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.511395 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.511428 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.511453 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.511494 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.511504 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:39Z","lastTransitionTime":"2026-03-21T03:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.515691 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.522119 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-host-var-lib-cni-multus\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.522164 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2521a678-ad6c-464b-bf7b-c4f6237c2822-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jrr5c\" (UID: \"2521a678-ad6c-464b-bf7b-c4f6237c2822\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.522184 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.522206 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-etc-openvswitch\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.522226 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-cnibin\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.522245 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-multus-socket-dir-parent\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.522262 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-host-run-netns\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.522279 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a5cb18bc-bda7-463e-98fe-6d8ff293b949-os-release\") pod \"multus-additional-cni-plugins-6xvsf\" (UID: \"a5cb18bc-bda7-463e-98fe-6d8ff293b949\") " pod="openshift-multus/multus-additional-cni-plugins-6xvsf" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.522298 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-run-systemd\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.522314 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-host-var-lib-kubelet\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.522333 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdrfx\" (UniqueName: \"kubernetes.io/projected/08dfc393-0ddb-4bde-9b1f-2a48549f4549-kube-api-access-fdrfx\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.522332 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-host-var-lib-cni-multus\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.522367 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tf6h\" (UniqueName: \"kubernetes.io/projected/7b79f01f-bf05-4f7d-b816-6ef01f21e949-kube-api-access-6tf6h\") pod \"node-resolver-mlsb2\" (UID: \"7b79f01f-bf05-4f7d-b816-6ef01f21e949\") " pod="openshift-dns/node-resolver-mlsb2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.522394 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cd9b1743-6b69-46d3-a429-6f83bf43317a-cni-binary-copy\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.522409 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-etc-openvswitch\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.522410 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a5cb18bc-bda7-463e-98fe-6d8ff293b949-cnibin\") pod \"multus-additional-cni-plugins-6xvsf\" (UID: 
\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\") " pod="openshift-multus/multus-additional-cni-plugins-6xvsf" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.522444 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a5cb18bc-bda7-463e-98fe-6d8ff293b949-cnibin\") pod \"multus-additional-cni-plugins-6xvsf\" (UID: \"a5cb18bc-bda7-463e-98fe-6d8ff293b949\") " pod="openshift-multus/multus-additional-cni-plugins-6xvsf" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.522449 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-host-var-lib-kubelet\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.522444 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a5cb18bc-bda7-463e-98fe-6d8ff293b949-os-release\") pod \"multus-additional-cni-plugins-6xvsf\" (UID: \"a5cb18bc-bda7-463e-98fe-6d8ff293b949\") " pod="openshift-multus/multus-additional-cni-plugins-6xvsf" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.522516 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-run-systemd\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.522521 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-multus-socket-dir-parent\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.522448 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-cnibin\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.522569 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-host-run-netns\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.522474 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a5cb18bc-bda7-463e-98fe-6d8ff293b949-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6xvsf\" (UID: \"a5cb18bc-bda7-463e-98fe-6d8ff293b949\") " pod="openshift-multus/multus-additional-cni-plugins-6xvsf" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.522576 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 03:47:39 crc 
kubenswrapper[4685]: I0321 03:47:39.522611 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-slash\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.522596 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-slash\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.522677 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.522720 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc2qv\" (UniqueName: \"kubernetes.io/projected/4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6-kube-api-access-sc2qv\") pod \"node-ca-ztl6v\" (UID: \"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\") " pod="openshift-image-registry/node-ca-ztl6v" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.522762 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6-serviceca\") pod \"node-ca-ztl6v\" (UID: \"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\") " pod="openshift-image-registry/node-ca-ztl6v" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.522800 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-var-lib-openvswitch\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.522814 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.522858 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/08dfc393-0ddb-4bde-9b1f-2a48549f4549-env-overrides\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.522902 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7b79f01f-bf05-4f7d-b816-6ef01f21e949-hosts-file\") pod \"node-resolver-mlsb2\" (UID: \"7b79f01f-bf05-4f7d-b816-6ef01f21e949\") " pod="openshift-dns/node-resolver-mlsb2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.522936 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-os-release\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.522968 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6-host\") pod \"node-ca-ztl6v\" (UID: \"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\") " pod="openshift-image-registry/node-ca-ztl6v" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.523003 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrczq\" (UniqueName: \"kubernetes.io/projected/fda9b1ff-e4a8-4d15-8f7b-2974991cd252-kube-api-access-wrczq\") pod \"network-metrics-daemon-v9rdl\" (UID: \"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\") " pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.523039 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a5cb18bc-bda7-463e-98fe-6d8ff293b949-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6xvsf\" (UID: \"a5cb18bc-bda7-463e-98fe-6d8ff293b949\") " pod="openshift-multus/multus-additional-cni-plugins-6xvsf" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.523071 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-kubelet\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.523104 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-run-netns\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: E0321 03:47:39.523013 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056
b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951
},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" 
Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.523146 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-log-socket\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.523203 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-cni-bin\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.523255 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/08dfc393-0ddb-4bde-9b1f-2a48549f4549-ovnkube-config\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.523264 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6-host\") pod \"node-ca-ztl6v\" (UID: \"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\") " pod="openshift-image-registry/node-ca-ztl6v" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.523291 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cea46fe2-4e41-43ab-a069-cb30fb4e732c-mcd-auth-proxy-config\") pod \"machine-config-daemon-7r9cg\" (UID: \"cea46fe2-4e41-43ab-a069-cb30fb4e732c\") " pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.523299 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2521a678-ad6c-464b-bf7b-c4f6237c2822-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jrr5c\" (UID: \"2521a678-ad6c-464b-bf7b-c4f6237c2822\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.523340 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-etc-kubernetes\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.523316 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-run-netns\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.523382 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2521a678-ad6c-464b-bf7b-c4f6237c2822-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jrr5c\" (UID: \"2521a678-ad6c-464b-bf7b-c4f6237c2822\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" Mar 21 03:47:39 crc 
kubenswrapper[4685]: I0321 03:47:39.523427 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-run-openvswitch\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.523459 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cd9b1743-6b69-46d3-a429-6f83bf43317a-multus-daemon-config\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.523498 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a5cb18bc-bda7-463e-98fe-6d8ff293b949-system-cni-dir\") pod \"multus-additional-cni-plugins-6xvsf\" (UID: \"a5cb18bc-bda7-463e-98fe-6d8ff293b949\") " pod="openshift-multus/multus-additional-cni-plugins-6xvsf" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.523536 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a5cb18bc-bda7-463e-98fe-6d8ff293b949-cni-binary-copy\") pod \"multus-additional-cni-plugins-6xvsf\" (UID: \"a5cb18bc-bda7-463e-98fe-6d8ff293b949\") " pod="openshift-multus/multus-additional-cni-plugins-6xvsf" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.523581 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-multus-conf-dir\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.523584 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7b79f01f-bf05-4f7d-b816-6ef01f21e949-hosts-file\") pod \"node-resolver-mlsb2\" (UID: \"7b79f01f-bf05-4f7d-b816-6ef01f21e949\") " pod="openshift-dns/node-resolver-mlsb2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.523629 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-host-run-multus-certs\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.523637 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-kubelet\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.523671 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-systemd-units\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.523705 4685 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-run-ovn-kubernetes\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.523737 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-host-var-lib-cni-bin\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.523775 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-hostroot\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.523873 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fda9b1ff-e4a8-4d15-8f7b-2974991cd252-metrics-certs\") pod \"network-metrics-daemon-v9rdl\" (UID: \"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\") " pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.523917 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngpjd\" (UniqueName: \"kubernetes.io/projected/a5cb18bc-bda7-463e-98fe-6d8ff293b949-kube-api-access-ngpjd\") pod \"multus-additional-cni-plugins-6xvsf\" (UID: \"a5cb18bc-bda7-463e-98fe-6d8ff293b949\") " pod="openshift-multus/multus-additional-cni-plugins-6xvsf" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.523950 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-cni-netd\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.523983 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-host-run-k8s-cni-cncf-io\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.524013 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-system-cni-dir\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.524234 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cd9b1743-6b69-46d3-a429-6f83bf43317a-cni-binary-copy\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.524262 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/cea46fe2-4e41-43ab-a069-cb30fb4e732c-rootfs\") pod 
\"machine-config-daemon-7r9cg\" (UID: \"cea46fe2-4e41-43ab-a069-cb30fb4e732c\") " pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.524304 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cea46fe2-4e41-43ab-a069-cb30fb4e732c-proxy-tls\") pod \"machine-config-daemon-7r9cg\" (UID: \"cea46fe2-4e41-43ab-a069-cb30fb4e732c\") " pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.524333 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-run-ovn\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.524346 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-os-release\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.524399 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/08dfc393-0ddb-4bde-9b1f-2a48549f4549-ovn-node-metrics-cert\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.524458 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.524490 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/08dfc393-0ddb-4bde-9b1f-2a48549f4549-ovnkube-script-lib\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.524521 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-multus-cni-dir\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.524557 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5ntg\" (UniqueName: \"kubernetes.io/projected/cea46fe2-4e41-43ab-a069-cb30fb4e732c-kube-api-access-j5ntg\") pod \"machine-config-daemon-7r9cg\" (UID: \"cea46fe2-4e41-43ab-a069-cb30fb4e732c\") " pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.524596 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2521a678-ad6c-464b-bf7b-c4f6237c2822-env-overrides\") pod 
\"ovnkube-control-plane-749d76644c-jrr5c\" (UID: \"2521a678-ad6c-464b-bf7b-c4f6237c2822\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.524625 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-node-log\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.524662 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x266z\" (UniqueName: \"kubernetes.io/projected/cd9b1743-6b69-46d3-a429-6f83bf43317a-kube-api-access-x266z\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.524727 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m8g7\" (UniqueName: \"kubernetes.io/projected/2521a678-ad6c-464b-bf7b-c4f6237c2822-kube-api-access-7m8g7\") pod \"ovnkube-control-plane-749d76644c-jrr5c\" (UID: \"2521a678-ad6c-464b-bf7b-c4f6237c2822\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.524807 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/08dfc393-0ddb-4bde-9b1f-2a48549f4549-env-overrides\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.524936 4685 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.524960 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.524980 4685 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.524997 4685 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.525016 4685 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.525034 4685 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.525051 4685 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.525069 4685 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.525086 4685 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.525112 4685 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.525129 4685 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.525147 4685 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.525163 4685 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.525188 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-host-run-k8s-cni-cncf-io\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.525201 4685 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.525204 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-log-socket\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.525160 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/08dfc393-0ddb-4bde-9b1f-2a48549f4549-ovnkube-config\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.525304 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-multus-cni-dir\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.525284 4685 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-run-openvswitch\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.525344 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-run-ovn\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.525358 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a5cb18bc-bda7-463e-98fe-6d8ff293b949-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6xvsf\" (UID: \"a5cb18bc-bda7-463e-98fe-6d8ff293b949\") " pod="openshift-multus/multus-additional-cni-plugins-6xvsf" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.525432 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6-serviceca\") pod \"node-ca-ztl6v\" (UID: \"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\") " pod="openshift-image-registry/node-ca-ztl6v" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.523299 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-var-lib-openvswitch\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.525870 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/cea46fe2-4e41-43ab-a069-cb30fb4e732c-rootfs\") pod \"machine-config-daemon-7r9cg\" (UID: \"cea46fe2-4e41-43ab-a069-cb30fb4e732c\") " pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.526119 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-host-var-lib-cni-bin\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.526136 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-multus-conf-dir\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.526156 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-run-ovn-kubernetes\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.526162 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-cni-bin\") 
pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.526171 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-hostroot\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.526207 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-node-log\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.526186 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.526476 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-systemd-units\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.526511 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-cni-netd\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.526577 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a5cb18bc-bda7-463e-98fe-6d8ff293b949-system-cni-dir\") pod \"multus-additional-cni-plugins-6xvsf\" (UID: \"a5cb18bc-bda7-463e-98fe-6d8ff293b949\") " pod="openshift-multus/multus-additional-cni-plugins-6xvsf" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.526620 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-etc-kubernetes\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: E0321 03:47:39.526690 4685 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 03:47:39 crc kubenswrapper[4685]: E0321 03:47:39.526880 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fda9b1ff-e4a8-4d15-8f7b-2974991cd252-metrics-certs podName:fda9b1ff-e4a8-4d15-8f7b-2974991cd252 nodeName:}" failed. No retries permitted until 2026-03-21 03:47:40.026853904 +0000 UTC m=+92.503922876 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fda9b1ff-e4a8-4d15-8f7b-2974991cd252-metrics-certs") pod "network-metrics-daemon-v9rdl" (UID: "fda9b1ff-e4a8-4d15-8f7b-2974991cd252") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.527534 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/08dfc393-0ddb-4bde-9b1f-2a48549f4549-ovnkube-script-lib\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.527668 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a5cb18bc-bda7-463e-98fe-6d8ff293b949-cni-binary-copy\") pod \"multus-additional-cni-plugins-6xvsf\" (UID: \"a5cb18bc-bda7-463e-98fe-6d8ff293b949\") " pod="openshift-multus/multus-additional-cni-plugins-6xvsf" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.527681 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.527888 4685 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.527913 4685 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.527930 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.527944 4685 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.527957 4685 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.527971 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.527983 4685 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.527995 4685 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc 
kubenswrapper[4685]: I0321 03:47:39.528008 4685 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528022 4685 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528036 4685 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528049 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528062 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528074 4685 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528086 4685 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528099 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528111 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528123 4685 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528136 4685 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528139 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cd9b1743-6b69-46d3-a429-6f83bf43317a-multus-daemon-config\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528149 4685 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on 
node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528222 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528236 4685 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528247 4685 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528283 4685 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528294 4685 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528312 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528322 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528330 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528362 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528372 4685 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528382 4685 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528391 4685 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528400 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: 
\"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528409 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528440 4685 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528454 4685 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528463 4685 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528471 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528480 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528489 4685 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528523 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528533 4685 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528542 4685 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528551 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528559 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528568 4685 
reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528598 4685 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528608 4685 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528620 4685 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528628 4685 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528636 4685 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528646 4685 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528676 4685 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528688 4685 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528697 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528706 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528713 4685 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528722 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc 
kubenswrapper[4685]: I0321 03:47:39.528730 4685 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528761 4685 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528770 4685 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528780 4685 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528788 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528797 4685 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528804 4685 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528813 4685 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528852 4685 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528862 4685 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528870 4685 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528878 4685 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528887 4685 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc 
kubenswrapper[4685]: I0321 03:47:39.528918 4685 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528928 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528937 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528945 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528956 4685 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.528970 4685 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529000 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529011 4685 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529019 4685 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529027 4685 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529038 4685 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529046 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529082 4685 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529334 
4685 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529343 4685 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529352 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529383 4685 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529392 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529401 4685 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529409 4685 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529417 4685 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529426 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529433 4685 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529469 4685 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529479 4685 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529490 4685 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 
03:47:39.529505 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529515 4685 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529546 4685 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529555 4685 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529564 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529572 4685 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529581 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529589 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529618 4685 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529628 4685 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529635 4685 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529644 4685 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529651 4685 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529660 
4685 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529668 4685 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529676 4685 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529706 4685 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529715 4685 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529725 4685 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529733 4685 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529741 4685 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529748 4685 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529778 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529788 4685 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529796 4685 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529805 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529813 4685 reconciler_common.go:293] "Volume detached for 
volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529821 4685 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529858 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529868 4685 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529876 4685 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529883 4685 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529897 4685 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529905 4685 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529913 4685 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529948 4685 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529965 4685 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529974 4685 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529982 4685 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.529990 4685 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" 
(UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.530015 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.530024 4685 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.530032 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.530040 4685 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.530050 4685 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.530058 4685 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.530066 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.530074 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.530120 4685 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.530130 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.530138 4685 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.530146 4685 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.530154 4685 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.530162 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.530170 4685 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.531184 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2521a678-ad6c-464b-bf7b-c4f6237c2822-env-overrides\") pod \"ovnkube-control-plane-749d76644c-jrr5c\" (UID: \"2521a678-ad6c-464b-bf7b-c4f6237c2822\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.535662 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a5cb18bc-bda7-463e-98fe-6d8ff293b949-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6xvsf\" (UID: \"a5cb18bc-bda7-463e-98fe-6d8ff293b949\") " pod="openshift-multus/multus-additional-cni-plugins-6xvsf" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.536665 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2521a678-ad6c-464b-bf7b-c4f6237c2822-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jrr5c\" (UID: \"2521a678-ad6c-464b-bf7b-c4f6237c2822\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.537360 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/08dfc393-0ddb-4bde-9b1f-2a48549f4549-ovn-node-metrics-cert\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.538010 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cea46fe2-4e41-43ab-a069-cb30fb4e732c-mcd-auth-proxy-config\") pod \"machine-config-daemon-7r9cg\" (UID: \"cea46fe2-4e41-43ab-a069-cb30fb4e732c\") " pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.540940 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cea46fe2-4e41-43ab-a069-cb30fb4e732c-proxy-tls\") pod \"machine-config-daemon-7r9cg\" (UID: \"cea46fe2-4e41-43ab-a069-cb30fb4e732c\") " pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.526230 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-host-run-multus-certs\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.544700 
4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cd9b1743-6b69-46d3-a429-6f83bf43317a-system-cni-dir\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.545139 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.545197 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.545209 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.545265 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.545280 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:39Z","lastTransitionTime":"2026-03-21T03:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.548216 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc2qv\" (UniqueName: \"kubernetes.io/projected/4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6-kube-api-access-sc2qv\") pod \"node-ca-ztl6v\" (UID: \"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\") " pod="openshift-image-registry/node-ca-ztl6v" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.555714 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdrfx\" (UniqueName: \"kubernetes.io/projected/08dfc393-0ddb-4bde-9b1f-2a48549f4549-kube-api-access-fdrfx\") pod \"ovnkube-node-cpfzk\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.556459 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrczq\" (UniqueName: \"kubernetes.io/projected/fda9b1ff-e4a8-4d15-8f7b-2974991cd252-kube-api-access-wrczq\") pod \"network-metrics-daemon-v9rdl\" (UID: \"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\") " pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.563101 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tf6h\" (UniqueName: \"kubernetes.io/projected/7b79f01f-bf05-4f7d-b816-6ef01f21e949-kube-api-access-6tf6h\") pod \"node-resolver-mlsb2\" (UID: \"7b79f01f-bf05-4f7d-b816-6ef01f21e949\") " pod="openshift-dns/node-resolver-mlsb2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.563110 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x266z\" (UniqueName: \"kubernetes.io/projected/cd9b1743-6b69-46d3-a429-6f83bf43317a-kube-api-access-x266z\") pod \"multus-7jcm2\" (UID: \"cd9b1743-6b69-46d3-a429-6f83bf43317a\") " pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.563891 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-j5ntg\" (UniqueName: \"kubernetes.io/projected/cea46fe2-4e41-43ab-a069-cb30fb4e732c-kube-api-access-j5ntg\") pod \"machine-config-daemon-7r9cg\" (UID: \"cea46fe2-4e41-43ab-a069-cb30fb4e732c\") " pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" Mar 21 03:47:39 crc kubenswrapper[4685]: E0321 03:47:39.564765 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.566791 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m8g7\" (UniqueName: \"kubernetes.io/projected/2521a678-ad6c-464b-bf7b-c4f6237c2822-kube-api-access-7m8g7\") pod \"ovnkube-control-plane-749d76644c-jrr5c\" (UID: \"2521a678-ad6c-464b-bf7b-c4f6237c2822\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" Mar 
21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.568713 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngpjd\" (UniqueName: \"kubernetes.io/projected/a5cb18bc-bda7-463e-98fe-6d8ff293b949-kube-api-access-ngpjd\") pod \"multus-additional-cni-plugins-6xvsf\" (UID: \"a5cb18bc-bda7-463e-98fe-6d8ff293b949\") " pod="openshift-multus/multus-additional-cni-plugins-6xvsf" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.569528 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.569577 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.569588 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.569606 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.569619 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:39Z","lastTransitionTime":"2026-03-21T03:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:39 crc kubenswrapper[4685]: E0321 03:47:39.579265 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.583151 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.583198 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.583207 4685 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.583228 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.583240 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:39Z","lastTransitionTime":"2026-03-21T03:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:39 crc kubenswrapper[4685]: E0321 03:47:39.593894 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.597483 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.597530 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.597545 4685 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.597565 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.597589 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:39Z","lastTransitionTime":"2026-03-21T03:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.603501 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 03:47:39 crc kubenswrapper[4685]: E0321 03:47:39.606574 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:39 crc kubenswrapper[4685]: E0321 03:47:39.606688 4685 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.608988 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.609034 4685 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.609045 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.609063 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.609079 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:39Z","lastTransitionTime":"2026-03-21T03:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.614333 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.624569 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.639075 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.649192 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.656545 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.663272 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-mlsb2" Mar 21 03:47:39 crc kubenswrapper[4685]: W0321 03:47:39.670771 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5cb18bc_bda7_463e_98fe_6d8ff293b949.slice/crio-99bd7835dc9a09f49ea58809b416fdbd2dda2bdd0a8ff47cf9b6bcdbdfbe1083 WatchSource:0}: Error finding container 99bd7835dc9a09f49ea58809b416fdbd2dda2bdd0a8ff47cf9b6bcdbdfbe1083: Status 404 returned error can't find the container with id 99bd7835dc9a09f49ea58809b416fdbd2dda2bdd0a8ff47cf9b6bcdbdfbe1083 Mar 21 03:47:39 crc kubenswrapper[4685]: W0321 03:47:39.671332 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2521a678_ad6c_464b_bf7b_c4f6237c2822.slice/crio-968e35499ad022ddf2268b6fab2402100e1dda688066136cdaae3192e93be0b3 WatchSource:0}: Error finding container 968e35499ad022ddf2268b6fab2402100e1dda688066136cdaae3192e93be0b3: Status 404 returned error can't find the container with id 968e35499ad022ddf2268b6fab2402100e1dda688066136cdaae3192e93be0b3 Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.671342 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ztl6v" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.677377 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-7jcm2" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.684408 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.711920 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.712229 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.712257 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.712278 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.712295 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:39Z","lastTransitionTime":"2026-03-21T03:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:39 crc kubenswrapper[4685]: W0321 03:47:39.727455 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b79f01f_bf05_4f7d_b816_6ef01f21e949.slice/crio-4c76d4d92e0e37535d5d9214daaf3bdf706252e90bf99e013a9d6ff7a97f3c82 WatchSource:0}: Error finding container 4c76d4d92e0e37535d5d9214daaf3bdf706252e90bf99e013a9d6ff7a97f3c82: Status 404 returned error can't find the container with id 4c76d4d92e0e37535d5d9214daaf3bdf706252e90bf99e013a9d6ff7a97f3c82 Mar 21 03:47:39 crc kubenswrapper[4685]: W0321 03:47:39.755632 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcea46fe2_4e41_43ab_a069_cb30fb4e732c.slice/crio-acf396b1b296ae8502009f9453d8b8fc9cb1426afddd92a1213bf29bb75a27b2 WatchSource:0}: Error finding container acf396b1b296ae8502009f9453d8b8fc9cb1426afddd92a1213bf29bb75a27b2: Status 404 returned error can't find the container with id acf396b1b296ae8502009f9453d8b8fc9cb1426afddd92a1213bf29bb75a27b2 Mar 21 03:47:39 crc kubenswrapper[4685]: W0321 03:47:39.763876 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd9b1743_6b69_46d3_a429_6f83bf43317a.slice/crio-367c1a0a5a5f3a08afcb30b2da6ea8694ddcbaccdcf4039188e497681b13460b WatchSource:0}: Error finding container 367c1a0a5a5f3a08afcb30b2da6ea8694ddcbaccdcf4039188e497681b13460b: Status 404 returned error can't find the container with id 367c1a0a5a5f3a08afcb30b2da6ea8694ddcbaccdcf4039188e497681b13460b Mar 21 03:47:39 crc kubenswrapper[4685]: W0321 03:47:39.780124 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dea34cf_d6c6_42fd_b4aa_8e175c6a78f6.slice/crio-70fa48445f1498aad449746f768f1af4c63111aa2294f73003756ec41187b553 WatchSource:0}: Error finding container 70fa48445f1498aad449746f768f1af4c63111aa2294f73003756ec41187b553: Status 404 returned error can't find the container with id 
70fa48445f1498aad449746f768f1af4c63111aa2294f73003756ec41187b553 Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.814569 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.814606 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.814620 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.814640 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.814652 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:39Z","lastTransitionTime":"2026-03-21T03:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.874629 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" event={"ID":"cea46fe2-4e41-43ab-a069-cb30fb4e732c","Type":"ContainerStarted","Data":"acf396b1b296ae8502009f9453d8b8fc9cb1426afddd92a1213bf29bb75a27b2"} Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.875733 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"5d981cec0491154ace10ce4f437d1101c8dd69c34129a42b7dd5af226f8b14b4"} Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.875760 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7977e5d7e135598a9fd86e69262804c7529015f7b7e3747ed1595cdab676c3a5"} Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.882110 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"09ec8b042a09c11bb910f30e1410ebc4fb6df3352b4f6b910d8c2d5bd1f2a31d"} Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.894568 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7jcm2" event={"ID":"cd9b1743-6b69-46d3-a429-6f83bf43317a","Type":"ContainerStarted","Data":"367c1a0a5a5f3a08afcb30b2da6ea8694ddcbaccdcf4039188e497681b13460b"} Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.898410 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" event={"ID":"a5cb18bc-bda7-463e-98fe-6d8ff293b949","Type":"ContainerStarted","Data":"99bd7835dc9a09f49ea58809b416fdbd2dda2bdd0a8ff47cf9b6bcdbdfbe1083"} Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.917281 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.917374 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.917391 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.917410 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.917422 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:39Z","lastTransitionTime":"2026-03-21T03:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.917706 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mlsb2" event={"ID":"7b79f01f-bf05-4f7d-b816-6ef01f21e949","Type":"ContainerStarted","Data":"4c76d4d92e0e37535d5d9214daaf3bdf706252e90bf99e013a9d6ff7a97f3c82"} Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.920501 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"125cd548b7255e1bf57d68a7323c8a8a697d12a8585106ec9156c16f595c3ee8"} Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.922298 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ztl6v" event={"ID":"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6","Type":"ContainerStarted","Data":"70fa48445f1498aad449746f768f1af4c63111aa2294f73003756ec41187b553"} Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.923886 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" event={"ID":"08dfc393-0ddb-4bde-9b1f-2a48549f4549","Type":"ContainerStarted","Data":"7dd5895cbe204199a83f73c83d8382055e46c7a26e8e16240fe09e92845509cf"} Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.926346 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" event={"ID":"2521a678-ad6c-464b-bf7b-c4f6237c2822","Type":"ContainerStarted","Data":"968e35499ad022ddf2268b6fab2402100e1dda688066136cdaae3192e93be0b3"} Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.933095 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.933259 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:47:39 crc kubenswrapper[4685]: E0321 03:47:39.933727 4685 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:47:40.93366778 +0000 UTC m=+93.410736572 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:47:39 crc kubenswrapper[4685]: E0321 03:47:39.934335 4685 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 03:47:39 crc kubenswrapper[4685]: E0321 03:47:39.934386 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 03:47:40.934376261 +0000 UTC m=+93.411445053 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.935341 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.950490 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.965227 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.976299 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:39 crc kubenswrapper[4685]: I0321 03:47:39.988737 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v9rdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.005152 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cpfzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.016052 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mlsb2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b79f01f-bf05-4f7d-b816-6ef01f21e949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tf6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mlsb2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.023604 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.023664 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.023676 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.023695 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.024621 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:40Z","lastTransitionTime":"2026-03-21T03:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.026737 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztl6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sc2qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztl6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.036020 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.036089 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fda9b1ff-e4a8-4d15-8f7b-2974991cd252-metrics-certs\") pod \"network-metrics-daemon-v9rdl\" (UID: \"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\") " pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.036188 4685 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.036218 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:47:40 crc kubenswrapper[4685]: E0321 03:47:40.036436 4685 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 03:47:40 crc kubenswrapper[4685]: E0321 03:47:40.036509 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 03:47:41.036490532 +0000 UTC m=+93.513559324 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 03:47:40 crc kubenswrapper[4685]: E0321 03:47:40.037309 4685 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 03:47:40 crc kubenswrapper[4685]: E0321 03:47:40.037383 4685 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 03:47:40 crc kubenswrapper[4685]: E0321 03:47:40.038352 4685 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 03:47:40 crc kubenswrapper[4685]: E0321 03:47:40.038504 4685 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 03:47:40 crc kubenswrapper[4685]: E0321 03:47:40.037425 4685 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 03:47:40 crc kubenswrapper[4685]: E0321 03:47:40.038568 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fda9b1ff-e4a8-4d15-8f7b-2974991cd252-metrics-certs podName:fda9b1ff-e4a8-4d15-8f7b-2974991cd252 nodeName:}" failed. No retries permitted until 2026-03-21 03:47:41.038521781 +0000 UTC m=+93.515590573 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fda9b1ff-e4a8-4d15-8f7b-2974991cd252-metrics-certs") pod "network-metrics-daemon-v9rdl" (UID: "fda9b1ff-e4a8-4d15-8f7b-2974991cd252") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 03:47:40 crc kubenswrapper[4685]: E0321 03:47:40.038806 4685 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 03:47:40 crc kubenswrapper[4685]: E0321 03:47:40.038992 4685 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.039197 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:40 crc kubenswrapper[4685]: E0321 03:47:40.040813 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
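The MountVolume.SetUp failures recorded here share a single cause: the projected service-account volumes reference the kube-root-ca.crt and openshift-service-ca.crt ConfigMaps, which the restarted kubelet has not yet registered, so each mount is requeued with a one-second durationBeforeRetry. A minimal sketch for summarizing these records from a saved journal dump (the file path and helper names are illustrative, not part of the log):

    import re
    import sys
    from collections import Counter

    # Matches kubelet "MountVolume.SetUp failed" records as they appear in
    # this journal, e.g.:
    #   Error: MountVolume.SetUp failed for volume "metrics-certs" (...)
    #   pod "network-metrics-daemon-v9rdl" (UID: "...") : object ... not registered
    PATTERN = re.compile(
        r'MountVolume\.SetUp failed for volume "(?P<volume>[^"]+)"'
        r'.*?pod "(?P<pod>[^"]+)"'
    )

    def tally_mount_failures(path):
        """Count SetUp failures per (pod, volume) pair in a journal dump."""
        counts = Counter()
        with open(path, encoding="utf-8", errors="replace") as f:
            for line in f:
                m = PATTERN.search(line)
                if m:
                    counts[(m.group("pod"), m.group("volume"))] += 1
        return counts

    if __name__ == "__main__":
        for (pod, volume), n in tally_mount_failures(sys.argv[1]).most_common():
            print(f"{n:4d}  pod={pod}  volume={volume}")

Run against a dump produced by, e.g., journalctl -u kubelet --no-pager > kubelet.log, this prints one line per blocked (pod, volume) pair, most frequent first.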
No retries permitted until 2026-03-21 03:47:41.039202931 +0000 UTC m=+93.516271723 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 03:47:40 crc kubenswrapper[4685]: E0321 03:47:40.040879 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 03:47:41.040858249 +0000 UTC m=+93.517927041 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.049105 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2521a678-ad6c-464b-bf7b-c4f6237c2822\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jrr5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.062210 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"
/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6xvsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.072109 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7jcm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9b1743-6b69-46d3-a429-6f83bf43317a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x266z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7jcm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.083268 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea46fe2-4e41-43ab-a069-cb30fb4e732c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7r9cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 
03:47:40.095420 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.128269 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.128317 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.128329 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.128364 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.128379 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:40Z","lastTransitionTime":"2026-03-21T03:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.230927 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.230970 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.230980 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.230999 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.231013 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:40Z","lastTransitionTime":"2026-03-21T03:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.305299 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.306249 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.307064 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.308738 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.309444 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.310562 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.311201 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.311930 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.313025 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.313581 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.314477 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.315521 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.316620 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.317245 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 21 03:47:40 
crc kubenswrapper[4685]: I0321 03:47:40.318268 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.318803 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.319376 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.320216 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.320759 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.321951 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.322667 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.323761 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.324939 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.325912 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.327080 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.327958 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.328909 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.329524 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.330258 4685 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.330746 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.331314 4685 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.331419 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.333034 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.334387 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.334433 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.334447 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.334475 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.334490 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:40Z","lastTransitionTime":"2026-03-21T03:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.335480 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.335901 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.337514 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.338582 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.339221 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.340277 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.340951 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.341764 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.342508 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.343566 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.344262 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.345178 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.345761 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.346789 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 21 03:47:40 
crc kubenswrapper[4685]: I0321 03:47:40.347539 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.348433 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.348986 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.349979 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.350668 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.351272 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.352105 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.437561 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.437612 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.437626 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.437647 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.437661 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:40Z","lastTransitionTime":"2026-03-21T03:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.540906 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.540963 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.540978 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.541000 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.541013 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:40Z","lastTransitionTime":"2026-03-21T03:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.644237 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.644303 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.644322 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.644347 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.644366 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:40Z","lastTransitionTime":"2026-03-21T03:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.747178 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.747245 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.747264 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.747292 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
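
This same five-event sequence (memory, disk, PID, NodeNotReady, then the setters.go:603 Ready=False condition) is repeating roughly every 100ms: each kubelet sync iteration re-reads the runtime status from CRI-O, and until a CNI configuration file appears in /etc/kubernetes/cni/net.d/ (typically written by the multus/OVN pods that are only now being started), the node stays NotReady. A quick way to check what the message is complaining about; a sketch only, with the directory taken from the log and the *.conf/*.conflist/*.json globs being conventional CNI config extensions rather than anything confirmed from kubelet source:

    // Sketch: report whether any CNI config file exists where the kubelet
    // says it is looking. Patterns follow common CNI naming conventions.
    package main

    import (
        "fmt"
        "path/filepath"
    )

    func main() {
        var found []string
        for _, pat := range []string{"*.conf", "*.conflist", "*.json"} {
            matches, _ := filepath.Glob(filepath.Join("/etc/kubernetes/cni/net.d", pat))
            found = append(found, matches...)
        }
        if len(found) == 0 {
            fmt.Println("no CNI configuration file yet; the node will stay NotReady")
            return
        }
        fmt.Println("CNI config present:", found)
    }

Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.747311 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:40Z","lastTransitionTime":"2026-03-21T03:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 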
Has your network provider started?"} Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.849748 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.849832 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.849878 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.849895 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.849908 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:40Z","lastTransitionTime":"2026-03-21T03:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.933071 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7aab80bc7857e9a3d86f540c725f4e35d0d86f979abed1ea5c9cddb4a6263924"} Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.935271 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"df1f675def99930733ba3d2e467e3c07307a00cd8a10f0c3dee668de18f92dbb"} Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.937375 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ztl6v" event={"ID":"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6","Type":"ContainerStarted","Data":"438ee4ce0e664eb23ee83b20eecbc81dd4f5ad12e50e98785be3696b58528952"} Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.940167 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7jcm2" event={"ID":"cd9b1743-6b69-46d3-a429-6f83bf43317a","Type":"ContainerStarted","Data":"a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46"} Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.942329 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" event={"ID":"cea46fe2-4e41-43ab-a069-cb30fb4e732c","Type":"ContainerStarted","Data":"2f8af9aaf0940271c159c00b04ffb7a1dd9471caa93ad4c7bd4507461a095923"} Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.942375 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" event={"ID":"cea46fe2-4e41-43ab-a069-cb30fb4e732c","Type":"ContainerStarted","Data":"682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3"} Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.944586 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.944873 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:47:40 crc kubenswrapper[4685]: E0321 03:47:40.944917 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:47:42.944876215 +0000 UTC m=+95.421945107 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:47:40 crc kubenswrapper[4685]: E0321 03:47:40.945051 4685 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 03:47:40 crc kubenswrapper[4685]: E0321 03:47:40.945137 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 03:47:42.945110682 +0000 UTC m=+95.422179624 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.946345 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mlsb2" event={"ID":"7b79f01f-bf05-4f7d-b816-6ef01f21e949","Type":"ContainerStarted","Data":"415d35423a66454f5ed06ba8facb7adbda91fae370e4d1d8a022cf7002e7b8bb"} Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.949011 4685 generic.go:334] "Generic (PLEG): container finished" podID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerID="246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a" exitCode=0 Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.949145 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" event={"ID":"08dfc393-0ddb-4bde-9b1f-2a48549f4549","Type":"ContainerDied","Data":"246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a"} Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.952582 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea46fe2-4e41-43ab-a069-cb30fb4e732c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7r9cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.954087 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" event={"ID":"2521a678-ad6c-464b-bf7b-c4f6237c2822","Type":"ContainerStarted","Data":"6bc29c132abc0c440dd96bf14118de27e46aa41322e61ce3d685eb43962330cc"} Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.954224 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" event={"ID":"2521a678-ad6c-464b-bf7b-c4f6237c2822","Type":"ContainerStarted","Data":"749ea865e46b19e2475b94b696bb2be8bc27add3f2ba2420c38b9d67b2b8ddd1"} Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.957045 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.957093 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.957110 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.957134 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.957151 4685 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:40Z","lastTransitionTime":"2026-03-21T03:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.959407 4685 generic.go:334] "Generic (PLEG): container finished" podID="a5cb18bc-bda7-463e-98fe-6d8ff293b949" containerID="b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3" exitCode=0 Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.959513 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" event={"ID":"a5cb18bc-bda7-463e-98fe-6d8ff293b949","Type":"ContainerDied","Data":"b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3"} Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.971949 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aab80bc7857e9a3d86f540c725f4e35d0d86f979abed1ea5c9cddb4a6263924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d981cec0491154ace10ce4f437d1101c8dd69c34129a42b7dd5af226f8b14b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\
\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:40 crc kubenswrapper[4685]: I0321 03:47:40.987859 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6xvsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.006480 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7jcm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9b1743-6b69-46d3-a429-6f83bf43317a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x266z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7jcm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.017688 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.035330 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.045745 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.045780 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.045857 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.045887 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fda9b1ff-e4a8-4d15-8f7b-2974991cd252-metrics-certs\") pod \"network-metrics-daemon-v9rdl\" (UID: \"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\") " pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:47:41 crc kubenswrapper[4685]: E0321 03:47:41.046668 4685 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 03:47:41 crc kubenswrapper[4685]: E0321 03:47:41.046707 4685 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 03:47:41 crc kubenswrapper[4685]: E0321 
03:47:41.046721 4685 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 03:47:41 crc kubenswrapper[4685]: E0321 03:47:41.046778 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 03:47:43.046755699 +0000 UTC m=+95.523824491 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 03:47:41 crc kubenswrapper[4685]: E0321 03:47:41.047169 4685 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 03:47:41 crc kubenswrapper[4685]: E0321 03:47:41.047230 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 03:47:43.047211863 +0000 UTC m=+95.524280655 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 03:47:41 crc kubenswrapper[4685]: E0321 03:47:41.048242 4685 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 03:47:41 crc kubenswrapper[4685]: E0321 03:47:41.048265 4685 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 03:47:41 crc kubenswrapper[4685]: E0321 03:47:41.048280 4685 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
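
The "object ... not registered" failures here are a startup-ordering symptom rather than missing objects: the kubelet restarted about 95 seconds ago (the m=+95.5 monotonic offsets in the retry deadlines), and these pods' secret and configmap sources have not yet been re-registered with its caching managers, so every MountVolume.SetUp attempt fails and nestedpendingoperations.go schedules another try, here with durationBeforeRetry 2s and a delay that backs off as failures repeat. A hand-rolled sketch of that retry shape, assuming a simple doubling backoff with a cap (an approximation, not the kubelet's actual backoff code):

    // Illustrative retry loop echoing the "No retries permitted until ...
    // (durationBeforeRetry 2s)" entries. Initial delay, cap, and attempt
    // count are assumptions for the example.
    package main

    import (
        "errors"
        "fmt"
        "time"
    )

    func retryWithBackoff(op func() error, initial, maxDelay time.Duration, attempts int) error {
        delay := initial
        var lastErr error
        for i := 0; i < attempts; i++ {
            if lastErr = op(); lastErr == nil {
                return nil
            }
            fmt.Printf("attempt %d failed: %v (durationBeforeRetry %s)\n", i+1, lastErr, delay)
            time.Sleep(delay)
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay
            }
        }
        return lastErr
    }

    func main() {
        err := retryWithBackoff(func() error {
            return errors.New(`object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered`)
        }, 2*time.Second, 2*time.Minute, 3)
        fmt.Println("gave up:", err)
    }

Mar 21 03:47:41 crc kubenswrapper[4685]: E0321 03:47:41.048306 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 03:47:43.048297284 +0000 UTC m=+95.525366066 (durationBeforeRetry 2s). 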
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 03:47:41 crc kubenswrapper[4685]: E0321 03:47:41.048588 4685 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 03:47:41 crc kubenswrapper[4685]: E0321 03:47:41.048618 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fda9b1ff-e4a8-4d15-8f7b-2974991cd252-metrics-certs podName:fda9b1ff-e4a8-4d15-8f7b-2974991cd252 nodeName:}" failed. No retries permitted until 2026-03-21 03:47:43.048609523 +0000 UTC m=+95.525678315 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fda9b1ff-e4a8-4d15-8f7b-2974991cd252-metrics-certs") pod "network-metrics-daemon-v9rdl" (UID: "fda9b1ff-e4a8-4d15-8f7b-2974991cd252") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.055477 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
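
Every one of these status_manager.go:875 failures shares a single root cause: the API server routes pod status patches through the pod.network-node-identity.openshift.io webhook, which is evidently served at https://127.0.0.1:9743 by the network-node-identity pod on this same single-node (CRC) cluster. Until that pod's webhook endpoint is listening, every patch bounces with connection refused and the kubelet keeps retrying; this should clear on its own once the webhook container is up. A throwaway TCP probe for the endpoint, purely as an illustration, with the address copied from the log:

    // One-off probe for the webhook endpoint the status patches cannot
    // reach. Diagnostic sketch only; run on the node itself.
    package main

    import (
        "fmt"
        "net"
        "time"
    )

    func main() {
        conn, err := net.DialTimeout("tcp", "127.0.0.1:9743", 2*time.Second)
        if err != nil {
            fmt.Println("webhook endpoint not reachable yet:", err)
            return
        }
        conn.Close()
        fmt.Println("webhook endpoint is accepting connections")
    }

Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.061751 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.061831 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.061879 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.061910 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.061930 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:41Z","lastTransitionTime":"2026-03-21T03:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 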
Has your network provider started?"} Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.067281 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v9rdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.083383 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cpfzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.096720 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mlsb2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b79f01f-bf05-4f7d-b816-6ef01f21e949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tf6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mlsb2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.112797 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:41Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.123689 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztl6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sc2qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztl6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:41Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.139933 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:41Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.151017 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2521a678-ad6c-464b-bf7b-c4f6237c2822\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jrr5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:41Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.163219 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aab80bc7857e9a3d86f540c725f4e35d0d86f979abed1ea5c9cddb4a6263924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d981cec0491154ace10ce4f437d1101c8dd69c34129a42b7dd5af226f8b14b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:41Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.167097 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.167168 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.167187 4685 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.167215 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.167232 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:41Z","lastTransitionTime":"2026-03-21T03:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.181768 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6xvsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:41Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:41 crc 
kubenswrapper[4685]: I0321 03:47:41.193750 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7jcm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9b1743-6b69-46d3-a429-6f83bf43317a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x266z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\
"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7jcm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:41Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.210710 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea46fe2-4e41-43ab-a069-cb30fb4e732c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8af9aaf0940271c159c00b04ffb7a1dd9471caa93ad4c7bd4507461a095923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\
\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7r9cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:41Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.230646 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:41Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.246325 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:41Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.258726 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v9rdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:41Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.269764 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.269814 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.269827 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.269867 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.269881 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:41Z","lastTransitionTime":"2026-03-21T03:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.284765 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac
6e53a7127e0a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cpfzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:41Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.296908 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mlsb2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b79f01f-bf05-4f7d-b816-6ef01f21e949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://415d35423a66454f5ed06ba8facb7adbda91fae370e4d1d8a022cf7002e7b8bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tf6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mlsb2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:41Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.299932 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.299947 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.299946 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.299927 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:47:41 crc kubenswrapper[4685]: E0321 03:47:41.300057 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:47:41 crc kubenswrapper[4685]: E0321 03:47:41.300203 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:47:41 crc kubenswrapper[4685]: E0321 03:47:41.300282 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:47:41 crc kubenswrapper[4685]: E0321 03:47:41.300313 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.313456 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df1f675def99930733ba3d2e467e3c07307a00cd8a10f0c3dee668de18f92dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:41Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.328395 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:41Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.341464 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztl6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ee4ce0e664eb23ee83b20eecbc81dd4f5ad12e50e98785be3696b58528952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sc2qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztl6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:41Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.352851 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:41Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.365930 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2521a678-ad6c-464b-bf7b-c4f6237c2822\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749ea865e46b19e2475b94b696bb2be8bc27add3f2ba2420c38b9d67b2b8ddd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc29c132abc0c440dd96bf14118de27e46aa41322e61ce3d685eb43962330cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jrr5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:41Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.372597 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.372636 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.372647 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.372665 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.372676 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:41Z","lastTransitionTime":"2026-03-21T03:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.475972 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.476439 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.476448 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.476469 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.476482 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:41Z","lastTransitionTime":"2026-03-21T03:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.586571 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.586629 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.586642 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.586874 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.586891 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:41Z","lastTransitionTime":"2026-03-21T03:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.690386 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.690422 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.690431 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.690447 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.690457 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:41Z","lastTransitionTime":"2026-03-21T03:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.793017 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.793072 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.793084 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.793106 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.793121 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:41Z","lastTransitionTime":"2026-03-21T03:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.896222 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.896255 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.896264 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.896287 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.896297 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:41Z","lastTransitionTime":"2026-03-21T03:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.965393 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" event={"ID":"a5cb18bc-bda7-463e-98fe-6d8ff293b949","Type":"ContainerStarted","Data":"8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0"} Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.972618 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" event={"ID":"08dfc393-0ddb-4bde-9b1f-2a48549f4549","Type":"ContainerStarted","Data":"2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a"} Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.972675 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" event={"ID":"08dfc393-0ddb-4bde-9b1f-2a48549f4549","Type":"ContainerStarted","Data":"beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa"} Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.972692 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" event={"ID":"08dfc393-0ddb-4bde-9b1f-2a48549f4549","Type":"ContainerStarted","Data":"a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f"} Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.972729 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" event={"ID":"08dfc393-0ddb-4bde-9b1f-2a48549f4549","Type":"ContainerStarted","Data":"d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320"} Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.972747 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" event={"ID":"08dfc393-0ddb-4bde-9b1f-2a48549f4549","Type":"ContainerStarted","Data":"473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5"} Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.983169 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7jcm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9b1743-6b69-46d3-a429-6f83bf43317a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x266z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7jcm2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:41Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.994962 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea46fe2-4e41-43ab-a069-cb30fb4e732c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8af9aaf0940271c159c00b04ffb7a1dd9471caa93ad4c7bd4507461a095923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-7r9cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:41Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.999432 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.999477 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.999488 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.999507 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:41 crc kubenswrapper[4685]: I0321 03:47:41.999518 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:41Z","lastTransitionTime":"2026-03-21T03:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.010405 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aab80bc7857e9a3d86f540c725f4e35d0d86f979abed1ea5c9cddb4a6263924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d981cec0491154ace10ce4f437d1101c8dd69c34129a42b7dd5af226f8b14b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:42Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.027111 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6xvsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:42Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.040604 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:42Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.057604 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df1f675def99930733ba3d2e467e3c07307a00cd8a10f0c3dee668de18f92dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:42Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.070205 4685 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:42Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.082895 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:42Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.094605 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v9rdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:42Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.102423 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.102486 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.102506 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.102531 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.102548 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:42Z","lastTransitionTime":"2026-03-21T03:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.119815 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac
6e53a7127e0a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cpfzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:42Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.134818 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mlsb2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b79f01f-bf05-4f7d-b816-6ef01f21e949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://415d35423a66454f5ed06ba8facb7adbda91fae370e4d1d8a022cf7002e7b8bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tf6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mlsb2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:42Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.148053 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztl6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ee4ce0e664eb23ee83b20eecbc81dd4f5ad12e50e98785be3696b58528952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sc2qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztl6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:42Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.163079 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2521a678-ad6c-464b-bf7b-c4f6237c2822\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749ea865e46b19e2475b94b696bb2be8bc27add3f2ba2420c38b9d67b2b8ddd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc29c132abc0c440dd96bf14118de27e46aa41322e61ce3d685eb43962330cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jrr5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:42Z is after 
2025-08-24T17:21:41Z" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.177864 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:42Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.205405 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.205452 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.205465 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.205486 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.205502 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:42Z","lastTransitionTime":"2026-03-21T03:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.307772 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.307827 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.307853 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.307876 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.307889 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:42Z","lastTransitionTime":"2026-03-21T03:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.411208 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.411251 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.411262 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.411281 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.411297 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:42Z","lastTransitionTime":"2026-03-21T03:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.515188 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.515498 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.515506 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.515524 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.515536 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:42Z","lastTransitionTime":"2026-03-21T03:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.618478 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.618518 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.618532 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.618550 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.618567 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:42Z","lastTransitionTime":"2026-03-21T03:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.721337 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.721378 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.721389 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.721406 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.721417 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:42Z","lastTransitionTime":"2026-03-21T03:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.824589 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.824643 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.824654 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.824676 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.824696 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:42Z","lastTransitionTime":"2026-03-21T03:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.927610 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.927642 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.927650 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.927665 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.927675 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:42Z","lastTransitionTime":"2026-03-21T03:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.968175 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.968315 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:47:42 crc kubenswrapper[4685]: E0321 03:47:42.968467 4685 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 03:47:42 crc kubenswrapper[4685]: E0321 03:47:42.968546 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 03:47:46.96852301 +0000 UTC m=+99.445591802 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 03:47:42 crc kubenswrapper[4685]: E0321 03:47:42.969102 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:47:46.969066086 +0000 UTC m=+99.446134878 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.978110 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" event={"ID":"08dfc393-0ddb-4bde-9b1f-2a48549f4549","Type":"ContainerStarted","Data":"34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8"} Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.979395 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"9cedbf5aa59177e67115d9dbf436805563022b6cebdd1c909d460d7997af5333"} Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.982501 4685 generic.go:334] "Generic (PLEG): container finished" podID="a5cb18bc-bda7-463e-98fe-6d8ff293b949" containerID="8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0" exitCode=0 Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.982586 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" event={"ID":"a5cb18bc-bda7-463e-98fe-6d8ff293b949","Type":"ContainerDied","Data":"8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0"} Mar 21 03:47:42 crc kubenswrapper[4685]: I0321 03:47:42.998682 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64
b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6xvsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:42Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.013714 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7jcm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9b1743-6b69-46d3-a429-6f83bf43317a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x266z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7jcm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.025286 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea46fe2-4e41-43ab-a069-cb30fb4e732c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8af9aaf0940271c159c00b04ffb7a1dd9471caa93ad4c7bd4507461a095923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"nam
e\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7r9cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.030083 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.030117 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.030125 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.030142 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.030154 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:43Z","lastTransitionTime":"2026-03-21T03:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.040484 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aab80bc7857e9a3d86f540c725f4e35d0d86f979abed1ea5c9cddb4a6263924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d981cec0491154ace10ce4f437d1101c8dd69c34129a42b7dd5af226f8b14b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.053467 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cedbf5aa59177e67115d9dbf436805563022b6cebdd1c909d460d7997af5333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.069732 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.069783 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.069824 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.069861 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/fda9b1ff-e4a8-4d15-8f7b-2974991cd252-metrics-certs\") pod \"network-metrics-daemon-v9rdl\" (UID: \"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\") " pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:47:43 crc kubenswrapper[4685]: E0321 03:47:43.070001 4685 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 03:47:43 crc kubenswrapper[4685]: E0321 03:47:43.070034 4685 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 03:47:43 crc kubenswrapper[4685]: E0321 03:47:43.070053 4685 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 03:47:43 crc kubenswrapper[4685]: E0321 03:47:43.070136 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 03:47:47.070108686 +0000 UTC m=+99.547177478 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 03:47:43 crc kubenswrapper[4685]: E0321 03:47:43.070773 4685 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 03:47:43 crc kubenswrapper[4685]: E0321 03:47:43.070803 4685 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 03:47:43 crc kubenswrapper[4685]: E0321 03:47:43.070816 4685 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 03:47:43 crc kubenswrapper[4685]: E0321 03:47:43.070873 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 03:47:47.070860488 +0000 UTC m=+99.547929490 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 03:47:43 crc kubenswrapper[4685]: E0321 03:47:43.070965 4685 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 03:47:43 crc kubenswrapper[4685]: E0321 03:47:43.071009 4685 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 03:47:43 crc kubenswrapper[4685]: E0321 03:47:43.071071 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 03:47:47.071051923 +0000 UTC m=+99.548120715 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 03:47:43 crc kubenswrapper[4685]: E0321 03:47:43.071092 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fda9b1ff-e4a8-4d15-8f7b-2974991cd252-metrics-certs podName:fda9b1ff-e4a8-4d15-8f7b-2974991cd252 nodeName:}" failed. No retries permitted until 2026-03-21 03:47:47.071080494 +0000 UTC m=+99.548149516 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fda9b1ff-e4a8-4d15-8f7b-2974991cd252-metrics-certs") pod "network-metrics-daemon-v9rdl" (UID: "fda9b1ff-e4a8-4d15-8f7b-2974991cd252") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.072356 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df1f675def99930733ba3d2e467e3c07307a00cd8a10f0c3dee668de18f92dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.086762 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.108274 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.126711 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v9rdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.133000 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.133082 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.133097 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.133120 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.133133 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:43Z","lastTransitionTime":"2026-03-21T03:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.155097 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac
6e53a7127e0a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cpfzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.168605 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mlsb2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b79f01f-bf05-4f7d-b816-6ef01f21e949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://415d35423a66454f5ed06ba8facb7adbda91fae370e4d1d8a022cf7002e7b8bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tf6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mlsb2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.182370 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztl6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ee4ce0e664eb23ee83b20eecbc81dd4f5ad12e50e98785be3696b58528952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sc2qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztl6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.197266 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.218403 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2521a678-ad6c-464b-bf7b-c4f6237c2822\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749ea865e46b19e2475b94b696bb2be8bc27add3f2ba2420c38b9d67b2b8ddd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc29c132abc0c440dd96bf14118de27e46aa41322e61ce3d685eb43962330cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jrr5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:43Z is after 2025-08-24T17:21:41Z" Mar 21 
03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.236921 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.236965 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.236976 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.237023 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.237035 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:43Z","lastTransitionTime":"2026-03-21T03:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.238049 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-
21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6xvsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.251467 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7jcm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9b1743-6b69-46d3-a429-6f83bf43317a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\
"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x266z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7jcm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.263196 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea46fe2-4e41-43ab-a069-cb30fb4e732c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8af9aaf0940271c159c00b04ffb7a1dd9471caa93ad4c7bd4507461a095923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7r9cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.279955 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aab80bc7857e9a3d86f540c725f4e35d0d86f979abed1ea5c9cddb4a6263924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d981cec0491154ace10ce4f437d1101c8dd69c34129a42b7dd5af226f8b14b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-
overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.294142 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cedbf5aa59177e67115d9dbf436805563022b6cebdd1c909d460d7997af5333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.302387 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.302464 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.302503 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.302477 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:47:43 crc kubenswrapper[4685]: E0321 03:47:43.302648 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:47:43 crc kubenswrapper[4685]: E0321 03:47:43.302764 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:47:43 crc kubenswrapper[4685]: E0321 03:47:43.302903 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:47:43 crc kubenswrapper[4685]: E0321 03:47:43.303099 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.307857 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df1f675def99930733ba3d2e467e3c07307a00cd8a10f0c3dee668de18f92dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.320178 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.335267 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.340290 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.340337 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.340347 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.340366 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.340378 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:43Z","lastTransitionTime":"2026-03-21T03:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.346746 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v9rdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.370294 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cpfzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.384809 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mlsb2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b79f01f-bf05-4f7d-b816-6ef01f21e949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://415d35423a66454f5ed06ba8facb7adbda91fae370e4d1d8a022cf7002e7b8bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tf6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\
" for pod \"openshift-dns\"/\"node-resolver-mlsb2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.395925 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztl6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ee4ce0e664eb23ee83b20eecbc81dd4f5ad12e50e98785be3696b58528952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sc2qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztl6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.407392 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.422867 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2521a678-ad6c-464b-bf7b-c4f6237c2822\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749ea865e46b19e2475b94b696bb2be8bc27add3f2ba2420c38b9d67b2b8ddd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc29c132abc0c440dd96bf14118de27e46aa41322e61ce3d685eb43962330cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jrr5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:43Z is after 2025-08-24T17:21:41Z" Mar 21 
03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.443174 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.443212 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.443224 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.443243 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.443256 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:43Z","lastTransitionTime":"2026-03-21T03:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.546325 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.546367 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.546380 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.546397 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.546409 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:43Z","lastTransitionTime":"2026-03-21T03:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.650559 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.650600 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.650611 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.650628 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.650640 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:43Z","lastTransitionTime":"2026-03-21T03:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.753077 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.753124 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.753135 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.753151 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.753163 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:43Z","lastTransitionTime":"2026-03-21T03:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.856390 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.856442 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.856454 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.856473 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.856483 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:43Z","lastTransitionTime":"2026-03-21T03:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.959473 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.959518 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.959527 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.959545 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.959556 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:43Z","lastTransitionTime":"2026-03-21T03:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.987720 4685 generic.go:334] "Generic (PLEG): container finished" podID="a5cb18bc-bda7-463e-98fe-6d8ff293b949" containerID="2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef" exitCode=0 Mar 21 03:47:43 crc kubenswrapper[4685]: I0321 03:47:43.987866 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" event={"ID":"a5cb18bc-bda7-463e-98fe-6d8ff293b949","Type":"ContainerDied","Data":"2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef"} Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.002310 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod 
\"openshift-multus\"/\"network-metrics-daemon-v9rdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:44Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.020525 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cpfzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:44Z 
is after 2025-08-24T17:21:41Z" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.038933 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mlsb2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b79f01f-bf05-4f7d-b816-6ef01f21e949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://415d35423a66454f5ed06ba8facb7adbda91fae370e4d1d8a022cf7002e7b8bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tf6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mlsb2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:44Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.053988 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df1f675def99930733ba3d2e467e3c07307a00cd8a10f0c3dee668de18f92dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:44Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.066118 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.066161 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.066172 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.066191 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.066205 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:44Z","lastTransitionTime":"2026-03-21T03:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.066277 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:44Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.077610 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:44Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.090886 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztl6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ee4ce0e664eb23ee83b20eecbc81dd4f5ad12e50e98785be3696b58528952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-sc2qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztl6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:44Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.103177 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:44Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.115495 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2521a678-ad6c-464b-bf7b-c4f6237c2822\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749ea865e46b19e2475b94b696bb2be8bc27add3f2ba2420c38b9d67b2b8ddd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc29c132abc0c440dd96bf14118de27e46aa41322e61ce3d685eb43962330cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jrr5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:44Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.127501 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aab80bc7857e9a3d86f540c725f4e35d0d86f979abed1ea5c9cddb4a6263924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d981cec0491154ace10ce4f437d1101c8dd69c34129a42b7dd5af226f8b14b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:44Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.140671 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6xvsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:44Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.156598 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7jcm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9b1743-6b69-46d3-a429-6f83bf43317a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"
/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x266z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7jcm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:44Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.168889 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.168935 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.168948 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.168969 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.168982 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:44Z","lastTransitionTime":"2026-03-21T03:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.170352 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea46fe2-4e41-43ab-a069-cb30fb4e732c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8af9aaf0940271c159c00b04ffb7a1dd9471caa93ad4c7bd4507461a095923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7r9cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:44Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.182016 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cedbf5aa59177e67115d9dbf436805563022b6cebdd1c909d460d7997af5333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:44Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.272266 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.272327 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.272341 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.272365 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.272381 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:44Z","lastTransitionTime":"2026-03-21T03:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.375661 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.375711 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.375720 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.375740 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.375752 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:44Z","lastTransitionTime":"2026-03-21T03:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.478934 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.478997 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.479013 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.479041 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.479061 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:44Z","lastTransitionTime":"2026-03-21T03:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.581997 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.582046 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.582060 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.582083 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.582097 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:44Z","lastTransitionTime":"2026-03-21T03:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.689190 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.689242 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.689254 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.689281 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.689294 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:44Z","lastTransitionTime":"2026-03-21T03:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.792545 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.792604 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.792614 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.792650 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.792663 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:44Z","lastTransitionTime":"2026-03-21T03:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.896559 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.896625 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.896640 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.896666 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:44 crc kubenswrapper[4685]: I0321 03:47:44.896683 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:44Z","lastTransitionTime":"2026-03-21T03:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.001771 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" event={"ID":"08dfc393-0ddb-4bde-9b1f-2a48549f4549","Type":"ContainerStarted","Data":"86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485"} Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.001951 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.002032 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.002055 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.002091 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.002116 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:45Z","lastTransitionTime":"2026-03-21T03:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.005064 4685 generic.go:334] "Generic (PLEG): container finished" podID="a5cb18bc-bda7-463e-98fe-6d8ff293b949" containerID="c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3" exitCode=0 Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.005133 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" event={"ID":"a5cb18bc-bda7-463e-98fe-6d8ff293b949","Type":"ContainerDied","Data":"c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3"} Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.023325 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cpfzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:45Z 
is after 2025-08-24T17:21:41Z" Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.038532 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mlsb2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b79f01f-bf05-4f7d-b816-6ef01f21e949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://415d35423a66454f5ed06ba8facb7adbda91fae370e4d1d8a022cf7002e7b8bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tf6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mlsb2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:45Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.057371 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df1f675def99930733ba3d2e467e3c07307a00cd8a10f0c3dee668de18f92dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:45Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.077505 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:45Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.099185 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:45Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.105604 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.105641 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.105653 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.105675 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.105689 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:45Z","lastTransitionTime":"2026-03-21T03:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.112777 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v9rdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:45Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.125818 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztl6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ee4ce0e664eb23ee83b20eecbc81dd4f5ad12e50e98785be3696b58528952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sc2qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztl6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:45Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.141249 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:45Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.155816 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2521a678-ad6c-464b-bf7b-c4f6237c2822\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749ea865e46b19e2475b94b696bb2be8bc27add3f2ba2420c38b9d67b2b8ddd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc29c132abc0c440dd96bf14118de27e46aa41322e61ce3d685eb43962330cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jrr5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:45Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.176549 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aab80bc7857e9a3d86f540c725f4e35d0d86f979abed1ea5c9cddb4a6263924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d981cec0491154ace10ce4f437d1101c8dd69c34129a42b7dd5af226f8b14b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:45Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.190691 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\"
:\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6xvsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:45Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.201811 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7jcm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9b1743-6b69-46d3-a429-6f83bf43317a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x266z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7jcm2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:45Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.209679 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.209730 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.209760 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.209779 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.209792 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:45Z","lastTransitionTime":"2026-03-21T03:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.212444 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea46fe2-4e41-43ab-a069-cb30fb4e732c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8af9aaf0940271c159c00b04ffb7a1dd9471caa93ad4c7bd4507461a095923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7r9cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:45Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.223528 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cedbf5aa59177e67115d9dbf436805563022b6cebdd1c909d460d7997af5333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:45Z is after 2025-08-24T17:21:41Z"
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.300037 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl"
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.300079 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.300110 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.300134 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 21 03:47:45 crc kubenswrapper[4685]: E0321 03:47:45.300864 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252"
Mar 21 03:47:45 crc kubenswrapper[4685]: E0321 03:47:45.300969 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 21 03:47:45 crc kubenswrapper[4685]: E0321 03:47:45.301060 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 21 03:47:45 crc kubenswrapper[4685]: E0321 03:47:45.301111 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.312458 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.312487 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.312497 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.312512 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.312523 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:45Z","lastTransitionTime":"2026-03-21T03:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.417075 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.417134 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.417156 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.417184 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.417201 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:45Z","lastTransitionTime":"2026-03-21T03:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.521286 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.521361 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.521381 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.521406 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.521422 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:45Z","lastTransitionTime":"2026-03-21T03:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.624389 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.624496 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.624524 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.624563 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.624591 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:45Z","lastTransitionTime":"2026-03-21T03:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.727963 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.728033 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.728051 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.728079 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.728096 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:45Z","lastTransitionTime":"2026-03-21T03:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.831270 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.831306 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.831315 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.831331 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.831343 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:45Z","lastTransitionTime":"2026-03-21T03:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.933292 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.933322 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.933331 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.933346 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 03:47:45 crc kubenswrapper[4685]: I0321 03:47:45.933360 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:45Z","lastTransitionTime":"2026-03-21T03:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.012187 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" event={"ID":"a5cb18bc-bda7-463e-98fe-6d8ff293b949","Type":"ContainerStarted","Data":"0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb"} Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.031734 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aab80bc7857e9a3d86f540c725f4e35d0d86f979abed1ea5c9cddb4a6263924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d981cec0491154ace10ce4f437d1101c8dd69c34129a42b7dd5af226f8b14b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-21T03:47:46Z is after 2025-08-24T17:21:41Z"
Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.037242 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.037310 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.037325 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.037349 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.037366 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:46Z","lastTransitionTime":"2026-03-21T03:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.055702 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6xvsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:46Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.073227 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7jcm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9b1743-6b69-46d3-a429-6f83bf43317a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"na
me\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x266z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7jcm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:46Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.088666 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea46fe2-4e41-43ab-a069-cb30fb4e732c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8af9aaf0940271c159c00b04ffb7a1dd9471caa93ad4c7bd4507461a095923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7r9cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:46Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.103177 4685 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cedbf5aa59177e67115d9dbf436805563022b6cebdd1c909d460d7997af5333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:46Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.116612 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v9rdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:46Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.141132 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.141180 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.141197 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.141222 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.141238 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:46Z","lastTransitionTime":"2026-03-21T03:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.142177 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac
6e53a7127e0a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cpfzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:46Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.153720 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mlsb2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b79f01f-bf05-4f7d-b816-6ef01f21e949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://415d35423a66454f5ed06ba8facb7adbda91fae370e4d1d8a022cf7002e7b8bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tf6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mlsb2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:46Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.168973 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df1f675def99930733ba3d2e467e3c07307a00cd8a10f0c3dee668de18f92dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:46Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.183717 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:46Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.200912 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:46Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.211440 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztl6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ee4ce0e664eb23ee83b20eecbc81dd4f5ad12e50e98785be3696b58528952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-sc2qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztl6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:46Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.226772 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:46Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.240274 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2521a678-ad6c-464b-bf7b-c4f6237c2822\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749ea865e46b19e2475b94b696bb2be8bc27add3f2ba2420c38b9d67b2b8ddd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc29c132abc0c440dd96bf14118de27e46aa41322e61ce3d685eb43962330cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jrr5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:46Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.245072 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.245122 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.245132 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.245151 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.245163 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:46Z","lastTransitionTime":"2026-03-21T03:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.347380 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.347422 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.347433 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.347447 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.347460 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:46Z","lastTransitionTime":"2026-03-21T03:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.449482 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.449529 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.449541 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.449562 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.449581 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:46Z","lastTransitionTime":"2026-03-21T03:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.552759 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.552821 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.552862 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.552895 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.552914 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:46Z","lastTransitionTime":"2026-03-21T03:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.656019 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.656073 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.656086 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.656105 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.656118 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:46Z","lastTransitionTime":"2026-03-21T03:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.758940 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.758993 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.759002 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.759023 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.759036 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:46Z","lastTransitionTime":"2026-03-21T03:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.862484 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.862523 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.862531 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.862546 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.862557 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:46Z","lastTransitionTime":"2026-03-21T03:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.965991 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.966055 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.966069 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.966091 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:46 crc kubenswrapper[4685]: I0321 03:47:46.966108 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:46Z","lastTransitionTime":"2026-03-21T03:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.016887 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.017142 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:47:47 crc kubenswrapper[4685]: E0321 03:47:47.017458 4685 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 03:47:47 crc kubenswrapper[4685]: E0321 03:47:47.017573 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 03:47:55.017541478 +0000 UTC m=+107.494610320 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 03:47:47 crc kubenswrapper[4685]: E0321 03:47:47.018254 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:47:55.018206387 +0000 UTC m=+107.495275289 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.032403 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" event={"ID":"08dfc393-0ddb-4bde-9b1f-2a48549f4549","Type":"ContainerStarted","Data":"280568a03157d6636f59b89e6215d0de36b0dfc4b96ded402153364088f108b4"} Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.068991 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.069048 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.069058 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.069078 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.069100 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:47Z","lastTransitionTime":"2026-03-21T03:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.118197 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.118286 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.118363 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.118419 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fda9b1ff-e4a8-4d15-8f7b-2974991cd252-metrics-certs\") pod \"network-metrics-daemon-v9rdl\" (UID: \"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\") " pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:47:47 crc kubenswrapper[4685]: E0321 03:47:47.118464 4685 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 03:47:47 crc kubenswrapper[4685]: E0321 03:47:47.118494 4685 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 03:47:47 crc kubenswrapper[4685]: E0321 03:47:47.118511 4685 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 03:47:47 crc kubenswrapper[4685]: E0321 03:47:47.118676 4685 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 03:47:47 crc kubenswrapper[4685]: E0321 03:47:47.118695 4685 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 03:47:47 crc kubenswrapper[4685]: E0321 03:47:47.118751 4685 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 03:47:47 crc kubenswrapper[4685]: E0321 03:47:47.118613 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-21 03:47:55.118580638 +0000 UTC m=+107.595649470 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 03:47:47 crc kubenswrapper[4685]: E0321 03:47:47.118690 4685 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 03:47:47 crc kubenswrapper[4685]: E0321 03:47:47.118775 4685 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 03:47:47 crc kubenswrapper[4685]: E0321 03:47:47.118952 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 03:47:55.118809145 +0000 UTC m=+107.595878007 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 03:47:47 crc kubenswrapper[4685]: E0321 03:47:47.119004 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fda9b1ff-e4a8-4d15-8f7b-2974991cd252-metrics-certs podName:fda9b1ff-e4a8-4d15-8f7b-2974991cd252 nodeName:}" failed. No retries permitted until 2026-03-21 03:47:55.11898566 +0000 UTC m=+107.596054472 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fda9b1ff-e4a8-4d15-8f7b-2974991cd252-metrics-certs") pod "network-metrics-daemon-v9rdl" (UID: "fda9b1ff-e4a8-4d15-8f7b-2974991cd252") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 03:47:47 crc kubenswrapper[4685]: E0321 03:47:47.119036 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 03:47:55.119022321 +0000 UTC m=+107.596091383 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.172428 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.172480 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.172492 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.172514 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.172528 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:47Z","lastTransitionTime":"2026-03-21T03:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.275226 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.275303 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.275323 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.275353 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.275373 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:47Z","lastTransitionTime":"2026-03-21T03:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.300681 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.300757 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.300700 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.300828 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:47:47 crc kubenswrapper[4685]: E0321 03:47:47.301028 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:47:47 crc kubenswrapper[4685]: E0321 03:47:47.301189 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:47:47 crc kubenswrapper[4685]: E0321 03:47:47.301461 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:47:47 crc kubenswrapper[4685]: E0321 03:47:47.301600 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.378217 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.378287 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.378311 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.378347 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.378370 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:47Z","lastTransitionTime":"2026-03-21T03:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.481174 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.481264 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.481291 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.481324 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.481346 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:47Z","lastTransitionTime":"2026-03-21T03:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.584222 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.584761 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.584780 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.584810 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.584828 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:47Z","lastTransitionTime":"2026-03-21T03:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.687871 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.687926 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.687939 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.687958 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.687971 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:47Z","lastTransitionTime":"2026-03-21T03:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.790774 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.790825 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.790851 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.790874 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.790885 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:47Z","lastTransitionTime":"2026-03-21T03:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.893601 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.893655 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.893665 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.893682 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.893695 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:47Z","lastTransitionTime":"2026-03-21T03:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.996651 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.996751 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.996774 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.996815 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:47 crc kubenswrapper[4685]: I0321 03:47:47.996877 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:47Z","lastTransitionTime":"2026-03-21T03:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.041747 4685 generic.go:334] "Generic (PLEG): container finished" podID="a5cb18bc-bda7-463e-98fe-6d8ff293b949" containerID="0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb" exitCode=0 Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.041905 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" event={"ID":"a5cb18bc-bda7-463e-98fe-6d8ff293b949","Type":"ContainerDied","Data":"0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb"} Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.042260 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.066991 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.084163 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.090821 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2521a678-ad6c-464b-bf7b-c4f6237c2822\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749ea865e46b19e2475b94b696bb2be8bc27add3f2ba2420c38b9d67b2b8ddd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc29c132abc0c440dd96bf14118de27e46aa41322e61ce3d685eb43962330cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jrr5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.100773 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.100918 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.100948 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.100984 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.101012 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:48Z","lastTransitionTime":"2026-03-21T03:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.127801 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aab80bc7857e9a3d86f540c725f4e35d0d86f979abed1ea5c9cddb4a6263924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d981cec0491154ace10ce4f437d1101c8dd69c34129a42b7dd5af226f8b14b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.153319 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mo
untPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6xvsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.178885 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7jcm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9b1743-6b69-46d3-a429-6f83bf43317a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x266z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7jcm2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.202062 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea46fe2-4e41-43ab-a069-cb30fb4e732c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8af9aaf0940271c159c00b04ffb7a1dd9471caa93ad4c7bd4507461a095923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-7r9cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.206276 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.206304 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.206312 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.206335 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.206345 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:48Z","lastTransitionTime":"2026-03-21T03:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.222720 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cedbf5aa59177e67115d9dbf436805563022b6cebdd1c909d460d7997af5333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.247010 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v9rdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is 
after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.276239 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"con
tainerID\\\":\\\"cri-o://34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://280568a03157d6636f59b89e6215d0de36b0dfc4b96ded402153364088f108b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnku
be-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cpfzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.294061 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mlsb2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b79f01f-bf05-4f7d-b816-6ef01f21e949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://415d35423a66454f5ed06ba8facb7adbda91fae370e4d1d8a022cf7002e7b8bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tf6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mlsb2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.309218 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.309263 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.309276 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.309297 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.309312 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:48Z","lastTransitionTime":"2026-03-21T03:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.321287 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df1f675def99930733ba3d2e467e3c07307a00cd8a10f0c3dee668de18f92dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.339237 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.354460 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.362730 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztl6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ee4ce0e664eb23ee83b20eecbc81dd4f5ad12e50e98785be3696b58528952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sc2qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztl6v\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.374086 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aab80bc7857e9a3d86f540c725f4e35d0d86f979abed1ea5c9cddb4a6263924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d981cec0491154ace10ce4f437d1101c8dd69c34129a42b7dd5af226f8b14b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 
2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.387038 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:44Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6xvsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.401974 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7jcm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9b1743-6b69-46d3-a429-6f83bf43317a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x266z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7jcm2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.413170 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.413228 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.413246 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.413273 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.413292 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:48Z","lastTransitionTime":"2026-03-21T03:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.425300 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea46fe2-4e41-43ab-a069-cb30fb4e732c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8af9aaf0940271c159c00b04ffb7a1dd9471caa93ad4c7bd4507461a095923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7r9cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.440411 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cedbf5aa59177e67115d9dbf436805563022b6cebdd1c909d460d7997af5333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.454289 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mlsb2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b79f01f-bf05-4f7d-b816-6ef01f21e949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://415d35423a66454f5ed06ba8facb7adbda91fae370e4d1d8a022cf7002e7b8bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tf6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mlsb2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.471515 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df1f675def99930733ba3d2e467e3c07307a00cd8a10f0c3dee668de18f92dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.484693 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.494385 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.502677 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v9rdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.515531 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.515577 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.515586 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.515603 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.515614 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:48Z","lastTransitionTime":"2026-03-21T03:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.521825 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://280568a03157d6636f59b89e6215d0de36b0dfc4b96ded402153364088f108b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkub
e-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cpfzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.532216 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztl6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ee4ce0e664eb23ee83b20eecbc81dd4f5ad12e50e98785be3696b58528952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sc2qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztl6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.545237 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.554853 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2521a678-ad6c-464b-bf7b-c4f6237c2822\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749ea865e46b19e2475b94b696bb2be8bc27add3f2ba2420c38b9d67b2b8ddd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc29c132abc0c440dd96bf14118de27e46aa41322e61ce3d685eb43962330cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jrr5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.566932 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.577434 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2521a678-ad6c-464b-bf7b-c4f6237c2822\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749ea865e46b19e2475b94b696bb2be8bc27add3f2ba2420c38b9d67b2b8ddd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc29c132abc0c440dd96bf14118de27e46aa41322e61ce3d685eb43962330cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jrr5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.589012 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6xvsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.604427 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7jcm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9b1743-6b69-46d3-a429-6f83bf43317a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x266z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7jcm2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.615254 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea46fe2-4e41-43ab-a069-cb30fb4e732c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8af9aaf0940271c159c00b04ffb7a1dd9471caa93ad4c7bd4507461a095923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-7r9cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.625324 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.625406 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.625424 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.625448 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.625470 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:48Z","lastTransitionTime":"2026-03-21T03:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.635472 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aab80bc7857e9a3d86f540c725f4e35d0d86f979abed1ea5c9cddb4a6263924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d981cec0491154ace10ce4f437d1101c8dd69c34129a42b7dd5af226f8b14b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.648372 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cedbf5aa59177e67115d9dbf436805563022b6cebdd1c909d460d7997af5333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.659995 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df1f675def99930733ba3d2e467e3c07307a00cd8a10f0c3dee668de18f92dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.670295 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.679669 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.689664 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v9rdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.707445 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://280568a03157d6636f59b89e6215d0de36b0dfc4
b96ded402153364088f108b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cpfzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.718302 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mlsb2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b79f01f-bf05-4f7d-b816-6ef01f21e949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://415d35423a66454f5ed06ba8facb7adbda91fae370e4d1d8a022cf7002e7b8bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tf6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mlsb2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.729474 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztl6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ee4ce0e664eb23ee83b20eecbc81dd4f5ad12e50e98785be3696b58528952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sc2qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztl6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:48Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.730103 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.730152 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.730174 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.730202 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.730222 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:48Z","lastTransitionTime":"2026-03-21T03:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.832802 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.832854 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.832863 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.832879 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.832889 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:48Z","lastTransitionTime":"2026-03-21T03:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.935422 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.935456 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.935470 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.935488 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:48 crc kubenswrapper[4685]: I0321 03:47:48.935499 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:48Z","lastTransitionTime":"2026-03-21T03:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.038575 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.038607 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.038618 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.038636 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.038649 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:49Z","lastTransitionTime":"2026-03-21T03:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.047715 4685 generic.go:334] "Generic (PLEG): container finished" podID="a5cb18bc-bda7-463e-98fe-6d8ff293b949" containerID="6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf" exitCode=0 Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.048930 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" event={"ID":"a5cb18bc-bda7-463e-98fe-6d8ff293b949","Type":"ContainerDied","Data":"6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf"} Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.048976 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.049515 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.066428 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cedbf5aa59177e67115d9dbf436805563022b6cebdd1c909d460d7997af5333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:49Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.088670 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.089486 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:49Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.107054 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v9rdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:49Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.127813 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://280568a03157d6636f59b89e6215d0de36b0dfc4
b96ded402153364088f108b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cpfzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:49Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.137985 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mlsb2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b79f01f-bf05-4f7d-b816-6ef01f21e949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://415d35423a66454f5ed06ba8facb7adbda91fae370e4d1d8a022cf7002e7b8bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tf6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mlsb2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:49Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.141735 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.141784 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.141796 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.141816 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.141829 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:49Z","lastTransitionTime":"2026-03-21T03:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.150372 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df1f675def99930733ba3d2e467e3c07307a00cd8a10f0c3dee668de18f92dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:49Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.162028 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:49Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.172338 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztl6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ee4ce0e664eb23ee83b20eecbc81dd4f5ad12e50e98785be3696b58528952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sc2qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztl6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:49Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.189652 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:49Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.204423 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2521a678-ad6c-464b-bf7b-c4f6237c2822\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749ea865e46b19e2475b94b696bb2be8bc27add3f2ba2420c38b9d67b2b8ddd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc29c132abc0c440dd96bf14118de27e46aa41322e61ce3d685eb43962330cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jrr5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:49Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.220152 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aab80bc7857e9a3d86f540c725f4e35d0d86f979abed1ea5c9cddb4a6263924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d981cec0491154ace10ce4f437d1101c8dd69c34129a42b7dd5af226f8b14b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:49Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.235759 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6xvsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:49Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.245859 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.245897 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.245908 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.245924 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.245937 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:49Z","lastTransitionTime":"2026-03-21T03:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.250005 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7jcm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9b1743-6b69-46d3-a429-6f83bf43317a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x266z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7jcm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:49Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.262095 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea46fe2-4e41-43ab-a069-cb30fb4e732c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8af9aaf0940271c159c00b04ffb7a1dd9471caa93ad4c7bd4507461a095923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7r9cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:49Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.271893 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v9rdl\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:49Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.292289 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"
containerID\\\":\\\"cri-o://473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://280568a03157d6636f59b89e6215d0de36b0dfc4b96ded402153364088f108b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-
openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cpfzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:49Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.300608 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.300626 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.300608 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:47:49 crc kubenswrapper[4685]: E0321 03:47:49.300713 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.300743 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:47:49 crc kubenswrapper[4685]: E0321 03:47:49.300798 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:47:49 crc kubenswrapper[4685]: E0321 03:47:49.300987 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:47:49 crc kubenswrapper[4685]: E0321 03:47:49.301046 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.303338 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mlsb2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b79f01f-bf05-4f7d-b816-6ef01f21e949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://415d35423a66454f5ed06ba8facb7adbda91fae370e4d1d8a022cf7002e7b8bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tf6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mlsb2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:49Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.319226 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df1f675def99930733ba3d2e467e3c07307a00cd8a10f0c3dee668de18f92dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:49Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.337389 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:49Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.348770 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.348809 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.348823 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.348850 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.348861 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:49Z","lastTransitionTime":"2026-03-21T03:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.353679 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:49Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.364015 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztl6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ee4ce0e664eb23ee83b20eecbc81dd4f5ad12e50e98785be3696b58528952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sc2qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztl6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:49Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.375788 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:49Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.391210 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2521a678-ad6c-464b-bf7b-c4f6237c2822\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749ea865e46b19e2475b94b696bb2be8bc27add3f2ba2420c38b9d67b2b8ddd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc29c132abc0c440dd96bf14118de27e46aa41322e61ce3d685eb43962330cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jrr5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:49Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.405915 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aab80bc7857e9a3d86f540c725f4e35d0d86f979abed1ea5c9cddb4a6263924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d981cec0491154ace10ce4f437d1101c8dd69c34129a42b7dd5af226f8b14b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:49Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.426516 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6xvsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:49Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.441331 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7jcm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9b1743-6b69-46d3-a429-6f83bf43317a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x266z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7jcm2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:49Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.453248 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.453296 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.453309 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.453329 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.453342 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:49Z","lastTransitionTime":"2026-03-21T03:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.457062 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea46fe2-4e41-43ab-a069-cb30fb4e732c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8af9aaf0940271c159c00b04ffb7a1dd9471caa93ad4c7bd4507461a095923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7r9cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:49Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.482198 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cedbf5aa59177e67115d9dbf436805563022b6cebdd1c909d460d7997af5333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:49Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.560419 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.560518 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.560539 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.560569 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.560595 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:49Z","lastTransitionTime":"2026-03-21T03:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.663429 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.663494 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.663508 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.663529 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.663543 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:49Z","lastTransitionTime":"2026-03-21T03:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.766721 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.766771 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.766780 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.766803 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.766818 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:49Z","lastTransitionTime":"2026-03-21T03:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.869535 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.869598 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.869613 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.869636 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.869652 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:49Z","lastTransitionTime":"2026-03-21T03:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.978602 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.978650 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.978660 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.978682 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.978693 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:49Z","lastTransitionTime":"2026-03-21T03:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.986411 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.986615 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.986731 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.986864 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:49 crc kubenswrapper[4685]: I0321 03:47:49.987022 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:49Z","lastTransitionTime":"2026-03-21T03:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:50 crc kubenswrapper[4685]: E0321 03:47:50.005942 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:50Z is after 
2025-08-24T17:21:41Z" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.011104 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.011154 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.011165 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.011189 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.011200 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:50Z","lastTransitionTime":"2026-03-21T03:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:50 crc kubenswrapper[4685]: E0321 03:47:50.029373 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:50Z is after 
2025-08-24T17:21:41Z" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.034980 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.035255 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.035394 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.035527 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.035648 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:50Z","lastTransitionTime":"2026-03-21T03:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:50 crc kubenswrapper[4685]: E0321 03:47:50.053802 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[...],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:50Z is after 2025-08-24T17:21:41Z"
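
Every one of these status patches dies the same way: the kubelet reaches the webhook at 127.0.0.1:9743, but certificate verification fails because the host clock (2026-03-21) is past the certificate's NotAfter (2025-08-24). A minimal, hedged Go sketch of how one could confirm that from the node follows; the endpoint is taken from the log lines above, and the probe itself is illustrative, not part of OpenShift's own tooling.

```go
// Hedged sketch: dial the webhook endpoint seen in the log and print the
// served certificate's validity window against the current clock.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	// InsecureSkipVerify lets the handshake complete even though the
	// certificate is already expired, so we can inspect it.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	now := time.Now()
	fmt.Printf("NotBefore=%s NotAfter=%s now=%s expired=%v\n",
		cert.NotBefore.Format(time.RFC3339),
		cert.NotAfter.Format(time.RFC3339),
		now.Format(time.RFC3339),
		now.After(cert.NotAfter))
}
```
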
2025-08-24T17:21:41Z" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.056623 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" event={"ID":"a5cb18bc-bda7-463e-98fe-6d8ff293b949","Type":"ContainerStarted","Data":"7b3a945a5915df9df43b15e02b30e3ffa4b0f0dd4e9283d54f80b4adb56f368c"} Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.058579 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.058617 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.058630 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.058651 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.058666 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:50Z","lastTransitionTime":"2026-03-21T03:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.074216 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:50Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:50 crc kubenswrapper[4685]: E0321 03:47:50.077813 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[...],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:50Z is after 2025-08-24T17:21:41Z"
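
The NodeNotReady/Ready=False condition that keeps being recorded carries one concrete cause: no CNI configuration file in /etc/kubernetes/cni/net.d/. Below is a rough, hedged sketch of the kind of check behind that message; the glob patterns are an assumption, and the real kubelet/CRI-O network-readiness logic is more involved than this.

```go
// Hedged sketch of the readiness gate implied by the repeated
// "no CNI configuration file in /etc/kubernetes/cni/net.d/" message:
// the network stays unready until at least one CNI config file appears.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// cniConfigPresent reports whether any plausible CNI config file exists.
// The extensions checked here are an assumption for illustration.
func cniConfigPresent(dir string) bool {
	for _, pat := range []string{"*.conf", "*.conflist", "*.json"} {
		matches, err := filepath.Glob(filepath.Join(dir, pat))
		if err == nil && len(matches) > 0 {
			return true
		}
	}
	return false
}

func main() {
	dir := "/etc/kubernetes/cni/net.d/" // path taken from the log
	if !cniConfigPresent(dir) {
		fmt.Printf("NetworkReady=false: no CNI configuration file in %s. Has your network provider started?\n", dir)
		os.Exit(1)
	}
	fmt.Println("NetworkReady=true")
}
```
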
2025-08-24T17:21:41Z" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.082735 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.082792 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.082806 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.082830 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.082888 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:50Z","lastTransitionTime":"2026-03-21T03:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.090453 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2521a678-ad6c-464b-bf7b-c4f6237c2822\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749ea865e46b19e2475b94b696bb2be8bc27add3f2ba2420c38b9d67b2b8ddd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc29c132abc0c440dd96bf14118de27e46aa41322e61ce3d685eb43962330cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994
82919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jrr5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:50Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:50 crc kubenswrapper[4685]: E0321 03:47:50.098064 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[...],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:50Z is after 
2025-08-24T17:21:41Z" Mar 21 03:47:50 crc kubenswrapper[4685]: E0321 03:47:50.098231 4685 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.100536 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.100621 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.100640 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.100668 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.100691 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:50Z","lastTransitionTime":"2026-03-21T03:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.112942 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aab80bc7857e9a3d86f540c725f4e35d0d86f979abed1ea5c9cddb4a6263924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d981cec0491154ace10ce4f437d1101c8dd69c34129a42b7dd5af226f8b14b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:17
4f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:50Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.129690 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3a945a5915df9df43b15e02b30e3ffa4b0f0dd4e9283d54f80b4adb56f368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e4079209
6b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\
"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03
-21T03:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6xvsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:50Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.149944 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7jcm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9b1743-6b69-46d3-a429-6f83bf43317a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"
mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x266z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7jcm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:50Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.165756 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea46fe2-4e41-43ab-a069-cb30fb4e732c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8af9aaf0940271c159c00b04ffb7a1dd9471caa93ad4c7bd4507461a095923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7r9cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:50Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.185765 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cedbf5aa59177e67115d9dbf436805563022b6cebdd1c909d460d7997af5333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:50Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.200425 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mlsb2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b79f01f-bf05-4f7d-b816-6ef01f21e949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://415d35423a66454f5ed06ba8facb7adbda91fae370e4d1d8a022cf7002e7b8bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tf6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mlsb2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:50Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.203480 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.203511 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.203523 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.203539 4685 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.203549 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:50Z","lastTransitionTime":"2026-03-21T03:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.216937 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df1f675def99930733ba3d2e467e3c07307a00cd8a10f0c3dee668de18f92dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:50Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.231465 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:50Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.244028 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:50Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.258656 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v9rdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:50Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.284266 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://280568a03157d6636f59b89e6215d0de36b0dfc4
b96ded402153364088f108b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cpfzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:50Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.298275 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztl6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ee4ce0e664eb23ee83b20eecbc81dd4f5ad12e50e98785be3696b58528952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sc2qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztl6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:50Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.306459 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.306534 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.306551 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.306589 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.306609 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:50Z","lastTransitionTime":"2026-03-21T03:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.409460 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.409531 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.409552 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.409583 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.409602 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:50Z","lastTransitionTime":"2026-03-21T03:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.512346 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.512411 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.512429 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.512455 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.512473 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:50Z","lastTransitionTime":"2026-03-21T03:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.617171 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.617295 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.617323 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.617356 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.617381 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:50Z","lastTransitionTime":"2026-03-21T03:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.728721 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.728808 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.728832 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.728899 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.728925 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:50Z","lastTransitionTime":"2026-03-21T03:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.833237 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.833308 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.833329 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.833400 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.833421 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:50Z","lastTransitionTime":"2026-03-21T03:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.937239 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.937301 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.937311 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.937330 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:50 crc kubenswrapper[4685]: I0321 03:47:50.937342 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:50Z","lastTransitionTime":"2026-03-21T03:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.057497 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.057576 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.057598 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.057631 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.057658 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:51Z","lastTransitionTime":"2026-03-21T03:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.062783 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cpfzk_08dfc393-0ddb-4bde-9b1f-2a48549f4549/ovnkube-controller/0.log" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.067185 4685 generic.go:334] "Generic (PLEG): container finished" podID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerID="280568a03157d6636f59b89e6215d0de36b0dfc4b96ded402153364088f108b4" exitCode=1 Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.067265 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" event={"ID":"08dfc393-0ddb-4bde-9b1f-2a48549f4549","Type":"ContainerDied","Data":"280568a03157d6636f59b89e6215d0de36b0dfc4b96ded402153364088f108b4"} Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.069178 4685 scope.go:117] "RemoveContainer" containerID="280568a03157d6636f59b89e6215d0de36b0dfc4b96ded402153364088f108b4" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.091131 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:51Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.109778 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2521a678-ad6c-464b-bf7b-c4f6237c2822\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749ea865e46b19e2475b94b696bb2be8bc27add3f2ba2420c38b9d67b2b8ddd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc29c132abc0c440dd96bf14118de27e46aa41322e61ce3d685eb43962330cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jrr5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:51Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.130805 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aab80bc7857e9a3d86f540c725f4e35d0d86f979abed1ea5c9cddb4a6263924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d981cec0491154ace10ce4f437d1101c8dd69c34129a42b7dd5af226f8b14b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:51Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.155617 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3a945a5915df9df43b15e02b30e3ffa4b0f0dd4e9283d54f80b4adb56f368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{
\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f6
1158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6xvsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:51Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.161951 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.162041 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.162061 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.162091 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.162112 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:51Z","lastTransitionTime":"2026-03-21T03:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.178304 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7jcm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9b1743-6b69-46d3-a429-6f83bf43317a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x266z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7jcm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:51Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.203404 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea46fe2-4e41-43ab-a069-cb30fb4e732c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8af9aaf0940271c159c00b04ffb7a1dd9471caa93ad4c7bd4507461a095923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7r9cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:51Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.232599 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cedbf5aa59177e67115d9dbf436805563022b6cebdd1c909d460d7997af5333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:51Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.259711 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df1f675def99930733ba3d2e467e3c07307a00cd8a10f0c3dee668de18f92dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:51Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.266683 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.266749 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.266765 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.266790 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.266811 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:51Z","lastTransitionTime":"2026-03-21T03:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.287495 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:51Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.300723 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.300808 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.300818 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.300818 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:47:51 crc kubenswrapper[4685]: E0321 03:47:51.301315 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:47:51 crc kubenswrapper[4685]: E0321 03:47:51.301538 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:47:51 crc kubenswrapper[4685]: E0321 03:47:51.301642 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:47:51 crc kubenswrapper[4685]: E0321 03:47:51.301817 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.309376 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:51Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.318413 4685 scope.go:117] "RemoveContainer" containerID="c896022c1e0726ccc5a54da9c432a0ff4082158ad76abcea60c0a1427c69134b" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.318533 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 21 03:47:51 crc kubenswrapper[4685]: E0321 03:47:51.318880 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.327903 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v9rdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:51Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.348902 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://280568a03157d6636f59b89e6215d0de36b0dfc4
b96ded402153364088f108b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://280568a03157d6636f59b89e6215d0de36b0dfc4b96ded402153364088f108b4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"message\\\":\\\"4 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 03:47:50.269487 6514 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 03:47:50.269756 6514 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0321 03:47:50.269792 6514 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0321 03:47:50.269893 6514 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0321 03:47:50.269901 6514 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0321 03:47:50.269910 6514 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 03:47:50.269924 6514 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0321 03:47:50.269937 6514 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 03:47:50.269957 6514 factory.go:656] Stopping watch factory\\\\nI0321 03:47:50.269979 6514 ovnkube.go:599] Stopped ovnkube\\\\nI0321 03:47:50.269983 6514 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 03:47:50.270014 6514 metrics.go:553] Stopping metrics server at address 
\\\\\\\"\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef
0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cpfzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:51Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.361909 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mlsb2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b79f01f-bf05-4f7d-b816-6ef01f21e949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://415d35423a66454f5ed06ba8facb7adbda91fae370e4d1d8a022cf7002e7b8bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tf6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mlsb2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:51Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.369309 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.369371 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.369394 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.369421 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.369441 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:51Z","lastTransitionTime":"2026-03-21T03:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.378531 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztl6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ee4ce0e664eb23ee83b20eecbc81dd4f5ad12e50e98785be3696b58528952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sc2qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztl6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:51Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.472380 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.472427 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.472437 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.472455 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.472466 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:51Z","lastTransitionTime":"2026-03-21T03:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.576325 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.576392 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.576406 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.576431 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.576447 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:51Z","lastTransitionTime":"2026-03-21T03:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.680354 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.680417 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.680439 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.680469 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.680488 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:51Z","lastTransitionTime":"2026-03-21T03:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.784721 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.784782 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.784800 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.784826 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.784872 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:51Z","lastTransitionTime":"2026-03-21T03:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.901267 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.901330 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.901348 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.901376 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:51 crc kubenswrapper[4685]: I0321 03:47:51.901395 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:51Z","lastTransitionTime":"2026-03-21T03:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.003956 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.004013 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.004030 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.004052 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.004069 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:52Z","lastTransitionTime":"2026-03-21T03:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.071480 4685 scope.go:117] "RemoveContainer" containerID="c896022c1e0726ccc5a54da9c432a0ff4082158ad76abcea60c0a1427c69134b" Mar 21 03:47:52 crc kubenswrapper[4685]: E0321 03:47:52.071811 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.107143 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.107208 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.107226 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.107251 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.107269 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:52Z","lastTransitionTime":"2026-03-21T03:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.210251 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.210302 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.210313 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.210335 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.210348 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:52Z","lastTransitionTime":"2026-03-21T03:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.313172 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.313226 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.313238 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.313259 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.313279 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:52Z","lastTransitionTime":"2026-03-21T03:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.416485 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.416533 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.416546 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.416565 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.416579 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:52Z","lastTransitionTime":"2026-03-21T03:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.519550 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.519584 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.519592 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.519605 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.519615 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:52Z","lastTransitionTime":"2026-03-21T03:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.622607 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.622667 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.622687 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.622714 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.622733 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:52Z","lastTransitionTime":"2026-03-21T03:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.726200 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.726252 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.726264 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.726286 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.726300 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:52Z","lastTransitionTime":"2026-03-21T03:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.829592 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.829635 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.829644 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.829660 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.829673 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:52Z","lastTransitionTime":"2026-03-21T03:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.932668 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.932720 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.932738 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.932764 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:52 crc kubenswrapper[4685]: I0321 03:47:52.932782 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:52Z","lastTransitionTime":"2026-03-21T03:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.035548 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.035602 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.035619 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.035646 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.035669 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:53Z","lastTransitionTime":"2026-03-21T03:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.077577 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cpfzk_08dfc393-0ddb-4bde-9b1f-2a48549f4549/ovnkube-controller/0.log" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.081568 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" event={"ID":"08dfc393-0ddb-4bde-9b1f-2a48549f4549","Type":"ContainerStarted","Data":"1017f2ba3943e2b7de1a63f4eebf7771e232b07b86bd3208e81bd5d193b31d25"} Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.082244 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.114520 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aab80bc7857e9a3d86f540c725f4e35d0d86f979abed1ea5c9cddb4a6263924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d981cec0491154ace10ce4f437d1101c8dd69c34129a42b7dd5af226f8b14b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:53Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.138870 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.138929 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.138945 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.138971 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.138990 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:53Z","lastTransitionTime":"2026-03-21T03:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.152160 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3a945a5915df9df43b15e02b30e3ffa4b0f0dd4e9283d54f80b4adb56f368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6xvsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:53Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.180059 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7jcm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9b1743-6b69-46d3-a429-6f83bf43317a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x266z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7jcm2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:53Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.209423 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea46fe2-4e41-43ab-a069-cb30fb4e732c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8af9aaf0940271c159c00b04ffb7a1dd9471caa93ad4c7bd4507461a095923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-7r9cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:53Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.228523 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fd9d618-b4ed-4942-b915-76dc59fb834a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba587c4fe2f05966282b50ba5236b9f3d9ef6de63f72c70ae9f7a5222cb8b904\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60c96edd458d05f217a2e9f07a44bd221303d821a790382a82cff0b912d48f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8583e9b5dff82d5df52e281ba4069e9259b1c8fe3d1b8121d0e9f3f9e97d47b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c896022c1e0726ccc5a54da9c432a0ff4082158ad76abcea60c0a1427c69134b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c896022c1e0726ccc5a54da9c432a0ff4082158ad76abcea60c0a1427c69134b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T03:47:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 03:47:19.136464 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 03:47:19.136735 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 03:47:19.138007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2287411523/tls.crt::/tmp/serving-cert-2287411523/tls.key\\\\\\\"\\\\nI0321 03:47:19.320330 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 03:47:19.322862 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 03:47:19.322879 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 03:47:19.322902 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 03:47:19.322907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 03:47:19.326922 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 03:47:19.326936 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 03:47:19.326989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.326999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.327009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 03:47:19.327019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 03:47:19.327030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 03:47:19.327039 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 03:47:19.328267 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://182ad351b6c632ea64087d4784ea919d5c21165dc5d0373fa35db7f7f1eea435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:46:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:53Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.241760 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.241795 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.241830 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.241874 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 
21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.241887 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:53Z","lastTransitionTime":"2026-03-21T03:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.244125 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cedbf5aa59177e67115d9dbf436805563022b6cebdd1c909d460d7997af5333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:53Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.286088 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1017f2ba3943e2b7de1a63f4eebf7771e232b07b86bd3208e81bd5d193b31d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://280568a03157d6636f59b89e6215d0de36b0dfc4b96ded402153364088f108b4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"message\\\":\\\"4 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 03:47:50.269487 6514 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 03:47:50.269756 6514 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0321 03:47:50.269792 6514 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0321 03:47:50.269893 6514 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0321 03:47:50.269901 6514 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0321 03:47:50.269910 6514 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 03:47:50.269924 6514 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0321 03:47:50.269937 6514 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 03:47:50.269957 6514 factory.go:656] Stopping watch factory\\\\nI0321 03:47:50.269979 6514 ovnkube.go:599] Stopped ovnkube\\\\nI0321 03:47:50.269983 6514 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 03:47:50.270014 6514 metrics.go:553] Stopping metrics server at address 
\\\\\\\"\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\
\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cpfzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:53Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.300044 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.300158 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:47:53 crc kubenswrapper[4685]: E0321 03:47:53.300290 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.300679 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.300719 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:47:53 crc kubenswrapper[4685]: E0321 03:47:53.300781 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:47:53 crc kubenswrapper[4685]: E0321 03:47:53.300861 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:47:53 crc kubenswrapper[4685]: E0321 03:47:53.300931 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.310425 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mlsb2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b79f01f-bf05-4f7d-b816-6ef01f21e949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://415d35423a66454f5ed06ba8facb7adbda91fae370e4d1d8a022cf7002e7b8bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tf6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mlsb2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:53Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.323321 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df1f675def99930733ba3d2e467e3c07307a00cd8a10f0c3dee668de18f92dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:53Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.335956 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:53Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.343951 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.343985 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.343997 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.344013 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.344025 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:53Z","lastTransitionTime":"2026-03-21T03:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.348330 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:53Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.360180 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v9rdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:53Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.372129 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztl6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ee4ce0e664eb23ee83b20eecbc81dd4f5ad12e50e98785be3696b58528952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sc2qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztl6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:53Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.385009 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:53Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.396873 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2521a678-ad6c-464b-bf7b-c4f6237c2822\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749ea865e46b19e2475b94b696bb2be8bc27add3f2ba2420c38b9d67b2b8ddd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc29c132abc0c440dd96bf14118de27e46aa41322e61ce3d685eb43962330cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jrr5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:53Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.445609 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.445656 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.445665 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.445683 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.445695 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:53Z","lastTransitionTime":"2026-03-21T03:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.548226 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.548281 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.548290 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.548310 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.548324 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:53Z","lastTransitionTime":"2026-03-21T03:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.650583 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.650637 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.650648 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.650667 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.650679 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:53Z","lastTransitionTime":"2026-03-21T03:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.753773 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.753802 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.753812 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.753826 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.753859 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:53Z","lastTransitionTime":"2026-03-21T03:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.856028 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.856061 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.856119 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.856139 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.856151 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:53Z","lastTransitionTime":"2026-03-21T03:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.958291 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.958339 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.958357 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.958380 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:53 crc kubenswrapper[4685]: I0321 03:47:53.958396 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:53Z","lastTransitionTime":"2026-03-21T03:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.062963 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.063038 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.063066 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.063096 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.063115 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:54Z","lastTransitionTime":"2026-03-21T03:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.088295 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cpfzk_08dfc393-0ddb-4bde-9b1f-2a48549f4549/ovnkube-controller/1.log" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.089095 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cpfzk_08dfc393-0ddb-4bde-9b1f-2a48549f4549/ovnkube-controller/0.log" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.093641 4685 generic.go:334] "Generic (PLEG): container finished" podID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerID="1017f2ba3943e2b7de1a63f4eebf7771e232b07b86bd3208e81bd5d193b31d25" exitCode=1 Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.093760 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" event={"ID":"08dfc393-0ddb-4bde-9b1f-2a48549f4549","Type":"ContainerDied","Data":"1017f2ba3943e2b7de1a63f4eebf7771e232b07b86bd3208e81bd5d193b31d25"} Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.094048 4685 scope.go:117] "RemoveContainer" containerID="280568a03157d6636f59b89e6215d0de36b0dfc4b96ded402153364088f108b4" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.095171 4685 scope.go:117] "RemoveContainer" containerID="1017f2ba3943e2b7de1a63f4eebf7771e232b07b86bd3208e81bd5d193b31d25" Mar 21 03:47:54 crc kubenswrapper[4685]: E0321 03:47:54.095629 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-cpfzk_openshift-ovn-kubernetes(08dfc393-0ddb-4bde-9b1f-2a48549f4549)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.117334 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:54Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.132045 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2521a678-ad6c-464b-bf7b-c4f6237c2822\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749ea865e46b19e2475b94b696bb2be8bc27add3f2ba2420c38b9d67b2b8ddd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc29c132abc0c440dd96bf14118de27e46aa41322e61ce3d685eb43962330cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jrr5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:54Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.144942 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aab80bc7857e9a3d86f540c725f4e35d0d86f979abed1ea5c9cddb4a6263924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d981cec0491154ace10ce4f437d1101c8dd69c34129a42b7dd5af226f8b14b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:54Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.160599 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3a945a5915df9df43b15e02b30e3ffa4b0f0dd4e9283d54f80b4adb56f368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{
\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f6
1158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6xvsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:54Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.169484 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.169598 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.169617 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.169655 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.169675 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:54Z","lastTransitionTime":"2026-03-21T03:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.178167 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7jcm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9b1743-6b69-46d3-a429-6f83bf43317a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x266z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7jcm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:54Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.189984 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea46fe2-4e41-43ab-a069-cb30fb4e732c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8af9aaf0940271c159c00b04ffb7a1dd9471caa93ad4c7bd4507461a095923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7r9cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:54Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.203100 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fd9d618-b4ed-4942-b915-76dc59fb834a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba587c4fe2f05966282b50ba5236b9f3d9ef6de63f72c70ae9f7a5222cb8b904\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60c96edd458d05f217a2e9f07a44bd221303d821a790382a82cff0b912d48f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\
\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8583e9b5dff82d5df52e281ba4069e9259b1c8fe3d1b8121d0e9f3f9e97d47b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c896022c1e0726ccc5a54da9c432a0ff4082158ad76abcea60c0a1427c69134b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c896022c1e0726ccc5a54da9c432a0ff4082158ad76abcea60c0a1427c69134b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T03:47:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 03:47:19.136464 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 03:47:19.136735 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 03:47:19.138007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2287411523/tls.crt::/tmp/serving-cert-2287411523/tls.key\\\\\\\"\\\\nI0321 03:47:19.320330 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 03:47:19.322862 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 03:47:19.322879 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 03:47:19.322902 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 03:47:19.322907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 03:47:19.326922 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 03:47:19.326936 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 03:47:19.326989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.326999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.327009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 03:47:19.327019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 03:47:19.327030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 03:47:19.327039 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nF0321 03:47:19.328267 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://182ad351b6c632ea64087d4784ea919d5c21165dc5d0373fa35db7f7f1eea435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:46:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:54Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.215629 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cedbf5aa59177e67115d9dbf436805563022b6cebdd1c909d460d7997af5333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:54Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.235957 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1017f2ba3943e2b7de1a63f4eebf7771e232b07b
86bd3208e81bd5d193b31d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://280568a03157d6636f59b89e6215d0de36b0dfc4b96ded402153364088f108b4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"message\\\":\\\"4 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 03:47:50.269487 6514 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 03:47:50.269756 6514 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0321 03:47:50.269792 6514 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0321 03:47:50.269893 6514 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0321 03:47:50.269901 6514 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0321 03:47:50.269910 6514 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 03:47:50.269924 6514 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0321 03:47:50.269937 6514 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 03:47:50.269957 6514 factory.go:656] Stopping watch factory\\\\nI0321 03:47:50.269979 6514 ovnkube.go:599] Stopped ovnkube\\\\nI0321 03:47:50.269983 6514 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 03:47:50.270014 6514 metrics.go:553] Stopping metrics server at address \\\\\\\"\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1017f2ba3943e2b7de1a63f4eebf7771e232b07b86bd3208e81bd5d193b31d25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T03:47:53Z\\\",\\\"message\\\":\\\"troller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI0321 03:47:53.579415 6704 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0321 03:47:53.579424 6704 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0321 03:47:53.579430 6704 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0321 03:47:53.579414 6704 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0321 03:47:53.579473 6704 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c\\\\nI0321 03:47:53.579505 6704 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c\\\\nF0321 03:47:53.579530 6704 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network 
controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":
\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cpfzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:54Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.248040 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mlsb2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b79f01f-bf05-4f7d-b816-6ef01f21e949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://415d35423a66454f5ed06ba8facb7adbda91fae370e4d1d8a022cf7002e7b8bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tf6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mlsb2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:54Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.266183 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df1f675def99930733ba3d2e467e3c07307a00cd8a10f0c3dee668de18f92dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:54Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.272699 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.272732 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.272741 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.272755 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.272766 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:54Z","lastTransitionTime":"2026-03-21T03:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.283169 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:54Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.295641 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:54Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.308666 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v9rdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:54Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.321395 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztl6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ee4ce0e664eb23ee83b20eecbc81dd4f5ad12e50e98785be3696b58528952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sc2qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztl6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:54Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.376117 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.376164 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.376180 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.376201 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.376218 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:54Z","lastTransitionTime":"2026-03-21T03:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.478441 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.478523 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.478545 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.478572 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.478591 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:54Z","lastTransitionTime":"2026-03-21T03:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.582160 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.582210 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.582219 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.582238 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.582249 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:54Z","lastTransitionTime":"2026-03-21T03:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.686197 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.686270 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.686294 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.686327 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.686351 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:54Z","lastTransitionTime":"2026-03-21T03:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.789122 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.789172 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.789184 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.789208 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.789222 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:54Z","lastTransitionTime":"2026-03-21T03:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.891749 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.891787 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.891799 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.891892 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.891921 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:54Z","lastTransitionTime":"2026-03-21T03:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.995208 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.995257 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.995266 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.995282 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:54 crc kubenswrapper[4685]: I0321 03:47:54.995308 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:54Z","lastTransitionTime":"2026-03-21T03:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.024999 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:47:55 crc kubenswrapper[4685]: E0321 03:47:55.025167 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:48:11.025140504 +0000 UTC m=+123.502209296 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.025264 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:47:55 crc kubenswrapper[4685]: E0321 03:47:55.025398 4685 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 03:47:55 crc kubenswrapper[4685]: E0321 03:47:55.025437 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-21 03:48:11.025430053 +0000 UTC m=+123.502498845 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.097567 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.097610 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.097619 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.097637 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.097648 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:55Z","lastTransitionTime":"2026-03-21T03:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.099803 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cpfzk_08dfc393-0ddb-4bde-9b1f-2a48549f4549/ovnkube-controller/1.log" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.105566 4685 scope.go:117] "RemoveContainer" containerID="1017f2ba3943e2b7de1a63f4eebf7771e232b07b86bd3208e81bd5d193b31d25" Mar 21 03:47:55 crc kubenswrapper[4685]: E0321 03:47:55.105892 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-cpfzk_openshift-ovn-kubernetes(08dfc393-0ddb-4bde-9b1f-2a48549f4549)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.119117 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fd9d618-b4ed-4942-b915-76dc59fb834a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba587c4fe2f05966282b50ba5236b9f3d9ef6de63f72c70ae9f7a5222cb8b904\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60c96edd458d05f217a2e9f07a44bd221303d821a790382a82cff0b912d48f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8583e9b5dff82d5df52e281ba4069e9259b1c8fe3d1b8121d0e9f3f9e97d47b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c896022c1e0726ccc5a54da9c432a0ff4082158ad76abcea60c0a1427c69134b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c896022c1e0726ccc5a54da9c432a0ff4082158ad76abcea60c0a1427c69134b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T03:47:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 03:47:19.136464 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 03:47:19.136735 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 03:47:19.138007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2287411523/tls.crt::/tmp/serving-cert-2287411523/tls.key\\\\\\\"\\\\nI0321 03:47:19.320330 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 03:47:19.322862 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 03:47:19.322879 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 03:47:19.322902 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 03:47:19.322907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 03:47:19.326922 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 03:47:19.326936 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 03:47:19.326989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.326999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.327009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 03:47:19.327019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 03:47:19.327030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 03:47:19.327039 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 03:47:19.328267 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://182ad351b6c632ea64087d4784ea919d5c21165dc5d0373fa35db7f7f1eea435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:46:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:55Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.125717 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.125764 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.125796 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.125829 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fda9b1ff-e4a8-4d15-8f7b-2974991cd252-metrics-certs\") pod \"network-metrics-daemon-v9rdl\" (UID: \"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\") " pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:47:55 crc kubenswrapper[4685]: E0321 03:47:55.125955 4685 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 03:47:55 crc kubenswrapper[4685]: E0321 03:47:55.125998 4685 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 03:47:55 crc kubenswrapper[4685]: E0321 03:47:55.126014 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fda9b1ff-e4a8-4d15-8f7b-2974991cd252-metrics-certs podName:fda9b1ff-e4a8-4d15-8f7b-2974991cd252 nodeName:}" failed. No retries permitted until 2026-03-21 03:48:11.125996099 +0000 UTC m=+123.603064891 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fda9b1ff-e4a8-4d15-8f7b-2974991cd252-metrics-certs") pod "network-metrics-daemon-v9rdl" (UID: "fda9b1ff-e4a8-4d15-8f7b-2974991cd252") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 03:47:55 crc kubenswrapper[4685]: E0321 03:47:55.126112 4685 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 03:47:55 crc kubenswrapper[4685]: E0321 03:47:55.126146 4685 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 03:47:55 crc kubenswrapper[4685]: E0321 03:47:55.125955 4685 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 03:47:55 crc kubenswrapper[4685]: E0321 03:47:55.126166 4685 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 03:47:55 crc kubenswrapper[4685]: E0321 03:47:55.126161 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 03:48:11.126120083 +0000 UTC m=+123.603188905 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 03:47:55 crc kubenswrapper[4685]: E0321 03:47:55.126188 4685 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 03:47:55 crc kubenswrapper[4685]: E0321 03:47:55.126210 4685 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 03:47:55 crc kubenswrapper[4685]: E0321 03:47:55.126232 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 03:48:11.126219846 +0000 UTC m=+123.603288848 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 03:47:55 crc kubenswrapper[4685]: E0321 03:47:55.126267 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 03:48:11.126241876 +0000 UTC m=+123.603310678 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.132163 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cedbf5aa59177e67115d9dbf436805563022b6cebdd1c909d460d7997af5333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:55Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.150922 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1017f2ba3943e2b7de1a63f4eebf7771e232b07b86bd3208e81bd5d193b31d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1017f2ba3943e2b7de1a63f4eebf7771e232b07b86bd3208e81bd5d193b31d25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T03:47:53Z\\\",\\\"message\\\":\\\"troller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI0321 03:47:53.579415 6704 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0321 03:47:53.579424 6704 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0321 03:47:53.579430 6704 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0321 03:47:53.579414 6704 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0321 03:47:53.579473 6704 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c\\\\nI0321 03:47:53.579505 6704 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c\\\\nF0321 03:47:53.579530 6704 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cpfzk_openshift-ovn-kubernetes(08dfc393-0ddb-4bde-9b1f-2a48549f4549)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cpfzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:55Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.163066 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mlsb2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b79f01f-bf05-4f7d-b816-6ef01f21e949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://415d35423a66454f5ed06ba8facb7adbda91fae370e4d1d8a022cf7002e7b8bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tf6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mlsb2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:55Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.174947 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df1f675def99930733ba3d2e467e3c07307a00cd8a10f0c3dee668de18f92dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:55Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.189644 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:55Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.200217 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.200285 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.200304 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.200331 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.200352 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:55Z","lastTransitionTime":"2026-03-21T03:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.207459 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:55Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.222092 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v9rdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:55Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.233225 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztl6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ee4ce0e664eb23ee83b20eecbc81dd4f5ad12e50e98785be3696b58528952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sc2qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztl6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:55Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.248575 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:55Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.260695 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2521a678-ad6c-464b-bf7b-c4f6237c2822\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749ea865e46b19e2475b94b696bb2be8bc27add3f2ba2420c38b9d67b2b8ddd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc29c132abc0c440dd96bf14118de27e46aa41322e61ce3d685eb43962330cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jrr5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:55Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.274798 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aab80bc7857e9a3d86f540c725f4e35d0d86f979abed1ea5c9cddb4a6263924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d981cec0491154ace10ce4f437d1101c8dd69c34129a42b7dd5af226f8b14b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:55Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.291026 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3a945a5915df9df43b15e02b30e3ffa4b0f0dd4e9283d54f80b4adb56f368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6xvsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:55Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.300127 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.300171 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.300220 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:47:55 crc kubenswrapper[4685]: E0321 03:47:55.300268 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.300289 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:47:55 crc kubenswrapper[4685]: E0321 03:47:55.300363 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:47:55 crc kubenswrapper[4685]: E0321 03:47:55.300451 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:47:55 crc kubenswrapper[4685]: E0321 03:47:55.300509 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.303609 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.303658 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.303677 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.303707 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.303725 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:55Z","lastTransitionTime":"2026-03-21T03:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.306566 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7jcm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9b1743-6b69-46d3-a429-6f83bf43317a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x266z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7jcm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:55Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.321263 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea46fe2-4e41-43ab-a069-cb30fb4e732c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8af9aaf0940271c159c00b04ffb7a1dd9471caa93ad4c7bd4507461a095923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7r9cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:55Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.407616 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.407724 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.408236 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.408340 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.408629 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:55Z","lastTransitionTime":"2026-03-21T03:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.511654 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.511713 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.511728 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.511752 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.511765 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:55Z","lastTransitionTime":"2026-03-21T03:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.614055 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.614128 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.614163 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.614199 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.614227 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:55Z","lastTransitionTime":"2026-03-21T03:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.717065 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.717132 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.717144 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.717164 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.717179 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:55Z","lastTransitionTime":"2026-03-21T03:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.819892 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.819953 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.819973 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.820001 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.820032 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:55Z","lastTransitionTime":"2026-03-21T03:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.924464 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.924533 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.924554 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.924582 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:55 crc kubenswrapper[4685]: I0321 03:47:55.924601 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:55Z","lastTransitionTime":"2026-03-21T03:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.027958 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.028037 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.028052 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.028076 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.028096 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:56Z","lastTransitionTime":"2026-03-21T03:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.130466 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.130510 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.130521 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.130537 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.130549 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:56Z","lastTransitionTime":"2026-03-21T03:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.233471 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.233515 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.233526 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.233543 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.233555 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:56Z","lastTransitionTime":"2026-03-21T03:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.336635 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.336678 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.336691 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.336705 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.336718 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:56Z","lastTransitionTime":"2026-03-21T03:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.439580 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.439628 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.439640 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.439658 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.439671 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:56Z","lastTransitionTime":"2026-03-21T03:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.542144 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.542191 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.542204 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.542223 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.542238 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:56Z","lastTransitionTime":"2026-03-21T03:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.645193 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.645461 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.645469 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.645483 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.645493 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:56Z","lastTransitionTime":"2026-03-21T03:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.748076 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.748114 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.748124 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.748142 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.748153 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:56Z","lastTransitionTime":"2026-03-21T03:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.850370 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.850411 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.850422 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.850440 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.850451 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:56Z","lastTransitionTime":"2026-03-21T03:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.956239 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.956314 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.956331 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.956360 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:56 crc kubenswrapper[4685]: I0321 03:47:56.956379 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:56Z","lastTransitionTime":"2026-03-21T03:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.060109 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.060169 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.060182 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.060202 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.060218 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:57Z","lastTransitionTime":"2026-03-21T03:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.162989 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.163078 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.163096 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.163126 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.163147 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:57Z","lastTransitionTime":"2026-03-21T03:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.266413 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.266478 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.266496 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.266524 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.266544 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:57Z","lastTransitionTime":"2026-03-21T03:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.300201 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.300201 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.300358 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.300202 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:47:57 crc kubenswrapper[4685]: E0321 03:47:57.300428 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:47:57 crc kubenswrapper[4685]: E0321 03:47:57.300319 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:47:57 crc kubenswrapper[4685]: E0321 03:47:57.300695 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:47:57 crc kubenswrapper[4685]: E0321 03:47:57.300800 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.369312 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.369379 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.369394 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.369415 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.369430 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:57Z","lastTransitionTime":"2026-03-21T03:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.472927 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.472994 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.473008 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.473032 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.473045 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:57Z","lastTransitionTime":"2026-03-21T03:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.576690 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.576761 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.576779 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.576807 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.576826 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:57Z","lastTransitionTime":"2026-03-21T03:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.680345 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.680397 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.680408 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.680429 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.680439 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:57Z","lastTransitionTime":"2026-03-21T03:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.784151 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.784230 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.784254 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.784359 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.784392 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:57Z","lastTransitionTime":"2026-03-21T03:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.886976 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.887041 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.887060 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.887088 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.887107 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:57Z","lastTransitionTime":"2026-03-21T03:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.990346 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.990416 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.990433 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.990464 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:57 crc kubenswrapper[4685]: I0321 03:47:57.990493 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:57Z","lastTransitionTime":"2026-03-21T03:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.095241 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.095328 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.095353 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.095382 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.095405 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:58Z","lastTransitionTime":"2026-03-21T03:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.199833 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.199957 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.199977 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.200007 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.200027 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:58Z","lastTransitionTime":"2026-03-21T03:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.303656 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.303724 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.303737 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.303760 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.303774 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:58Z","lastTransitionTime":"2026-03-21T03:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.328818 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fd9d618-b4ed-4942-b915-76dc59fb834a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba587c4fe2f05966282b50ba5236b9f3d9ef6de63f72c70ae9f7a5222cb8b904\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60c96edd458d05f217a2e9f07a44bd221303d821a790382a82cff0b912d48f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8583e9b5dff82d5df52e281ba4069e9259b1c8fe3d1b8121d0e9f3f9e97d47b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c896022c1e0726ccc5a54da9c432a0ff4082158ad76abcea60c0a1427c69134b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c896022c1e0726ccc5a54da9c432a0ff4082158ad76abcea60c0a1427c69134b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T03:47:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 03:47:19.136464 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 03:47:19.136735 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 03:47:19.138007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2287411523/tls.crt::/tmp/serving-cert-2287411523/tls.key\\\\\\\"\\\\nI0321 03:47:19.320330 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 03:47:19.322862 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 03:47:19.322879 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 03:47:19.322902 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 03:47:19.322907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 03:47:19.326922 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 03:47:19.326936 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 03:47:19.326989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.326999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.327009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 03:47:19.327019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 03:47:19.327030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 03:47:19.327039 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 03:47:19.328267 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://182ad351b6c632ea64087d4784ea919d5c21165dc5d0373fa35db7f7f1eea435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:46:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:58Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.347915 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cedbf5aa59177e67115d9dbf436805563022b6cebdd1c909d460d7997af5333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:58Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.373286 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df1f675def99930733ba3d2e467e3c07307a00cd8a10f0c3dee668de18f92dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:58Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.396259 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:58Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.406773 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.406812 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.406828 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.406890 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.406908 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:58Z","lastTransitionTime":"2026-03-21T03:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.421020 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:58Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.441777 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v9rdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:58Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.473329 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1017f2ba3943e2b7de1a63f4eebf7771e232b07b
86bd3208e81bd5d193b31d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1017f2ba3943e2b7de1a63f4eebf7771e232b07b86bd3208e81bd5d193b31d25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T03:47:53Z\\\",\\\"message\\\":\\\"troller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI0321 03:47:53.579415 6704 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0321 03:47:53.579424 6704 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0321 03:47:53.579430 6704 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0321 03:47:53.579414 6704 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0321 03:47:53.579473 6704 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c\\\\nI0321 03:47:53.579505 6704 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c\\\\nF0321 03:47:53.579530 6704 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cpfzk_openshift-ovn-kubernetes(08dfc393-0ddb-4bde-9b1f-2a48549f4549)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cpfzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:58Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.491696 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mlsb2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b79f01f-bf05-4f7d-b816-6ef01f21e949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://415d35423a66454f5ed06ba8facb7adbda91fae370e4d1d8a022cf7002e7b8bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tf6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mlsb2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:58Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.507636 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztl6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ee4ce0e664eb23ee83b20eecbc81dd4f5ad12e50e98785be3696b58528952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sc2qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztl6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:58Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.509917 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.509947 4685 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.509959 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.509978 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.509989 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:58Z","lastTransitionTime":"2026-03-21T03:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.530475 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:58Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.548319 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2521a678-ad6c-464b-bf7b-c4f6237c2822\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749ea865e46b19e2475b94b696bb2be8bc27add3f2ba2420c38b9d67b2b8ddd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc29c132abc0c440dd96bf14118de27e46aa41322e61ce3d685eb43962330cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jrr5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:58Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.568731 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3a945a5915df9df43b15e02b30e3ffa4b0f0dd4e9283d54f80b4adb56f368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveR
eadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:48Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6xvsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:58Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.586794 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7jcm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9b1743-6b69-46d3-a429-6f83bf43317a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"
host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x266z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7jcm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:58Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.602540 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea46fe2-4e41-43ab-a069-cb30fb4e732c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8af9aaf0940271c159c00b04ffb7a1dd9471caa93ad4c7bd4507461a095923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7r9cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:58Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.613414 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.613647 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.613771 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.613924 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.614091 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:58Z","lastTransitionTime":"2026-03-21T03:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.623656 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aab80bc7857e9a3d86f540c725f4e35d0d86f979abed1ea5c9cddb4a6263924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d981cec0491154ace10ce4f437d1101c8dd69c34129a42b7dd5af226f8b14b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:47:58Z is after 2025-08-24T17:21:41Z" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.717205 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 
03:47:58.717275 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.717297 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.717327 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.717348 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:58Z","lastTransitionTime":"2026-03-21T03:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.820133 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.820224 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.820249 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.820281 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.820305 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:58Z","lastTransitionTime":"2026-03-21T03:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.923609 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.923679 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.923696 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.923727 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:58 crc kubenswrapper[4685]: I0321 03:47:58.923749 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:58Z","lastTransitionTime":"2026-03-21T03:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.027681 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.027752 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.027771 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.027801 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.027827 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:59Z","lastTransitionTime":"2026-03-21T03:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.130671 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.130719 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.130735 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.130754 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.130769 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:59Z","lastTransitionTime":"2026-03-21T03:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.234778 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.234826 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.234855 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.234876 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.234890 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:59Z","lastTransitionTime":"2026-03-21T03:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.300028 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.300028 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:47:59 crc kubenswrapper[4685]: E0321 03:47:59.300582 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.300329 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.300058 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:47:59 crc kubenswrapper[4685]: E0321 03:47:59.300819 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:47:59 crc kubenswrapper[4685]: E0321 03:47:59.301392 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:47:59 crc kubenswrapper[4685]: E0321 03:47:59.301644 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.337469 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.337524 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.337544 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.337571 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.337588 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:59Z","lastTransitionTime":"2026-03-21T03:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.440459 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.440514 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.440525 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.440545 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.440557 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:59Z","lastTransitionTime":"2026-03-21T03:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.542991 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.543038 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.543050 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.543071 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.543085 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:59Z","lastTransitionTime":"2026-03-21T03:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.645423 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.645461 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.645470 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.645484 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.645495 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:59Z","lastTransitionTime":"2026-03-21T03:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.748524 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.748868 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.748949 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.749030 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.749100 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:59Z","lastTransitionTime":"2026-03-21T03:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.851362 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.851432 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.851456 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.851491 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.851515 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:59Z","lastTransitionTime":"2026-03-21T03:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.954088 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.954169 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.954195 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.954228 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:47:59 crc kubenswrapper[4685]: I0321 03:47:59.954254 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:47:59Z","lastTransitionTime":"2026-03-21T03:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.055955 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.055999 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.056012 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.056031 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.056044 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:00Z","lastTransitionTime":"2026-03-21T03:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.147788 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.147823 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.147848 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.147866 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.147878 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:00Z","lastTransitionTime":"2026-03-21T03:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:00 crc kubenswrapper[4685]: E0321 03:48:00.159450 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:00Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.162608 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.162742 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.162827 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.162911 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.162977 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:00Z","lastTransitionTime":"2026-03-21T03:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:00 crc kubenswrapper[4685]: E0321 03:48:00.174374 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:00Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.177787 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.177814 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.177825 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.177871 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.177883 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:00Z","lastTransitionTime":"2026-03-21T03:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:00 crc kubenswrapper[4685]: E0321 03:48:00.193737 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:00Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.197152 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.197199 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.197216 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.197237 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.197253 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:00Z","lastTransitionTime":"2026-03-21T03:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:00 crc kubenswrapper[4685]: E0321 03:48:00.215092 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:00Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.219704 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.219744 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.219757 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.219773 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.219783 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:00Z","lastTransitionTime":"2026-03-21T03:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:00 crc kubenswrapper[4685]: E0321 03:48:00.235533 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:00Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:00 crc kubenswrapper[4685]: E0321 03:48:00.235707 4685 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.238308 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.238334 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.238346 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.238364 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.238375 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:00Z","lastTransitionTime":"2026-03-21T03:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.341282 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.341350 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.341369 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.341399 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.341420 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:00Z","lastTransitionTime":"2026-03-21T03:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.444810 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.444874 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.444885 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.444900 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.444911 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:00Z","lastTransitionTime":"2026-03-21T03:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.548260 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.548737 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.548927 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.549119 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.549259 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:00Z","lastTransitionTime":"2026-03-21T03:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.652738 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.652788 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.652800 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.652821 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.652831 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:00Z","lastTransitionTime":"2026-03-21T03:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.755709 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.755756 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.755765 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.755779 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.755789 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:00Z","lastTransitionTime":"2026-03-21T03:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.858555 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.858608 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.858620 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.858638 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.858650 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:00Z","lastTransitionTime":"2026-03-21T03:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.961693 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.961755 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.961767 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.961787 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:00 crc kubenswrapper[4685]: I0321 03:48:00.961805 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:00Z","lastTransitionTime":"2026-03-21T03:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.064790 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.064907 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.064927 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.064957 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.064978 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:01Z","lastTransitionTime":"2026-03-21T03:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.167581 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.167699 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.167716 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.167743 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.167763 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:01Z","lastTransitionTime":"2026-03-21T03:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.270901 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.270955 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.270966 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.270985 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.270997 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:01Z","lastTransitionTime":"2026-03-21T03:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.300888 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.300886 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:48:01 crc kubenswrapper[4685]: E0321 03:48:01.301099 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.300916 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.300911 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:48:01 crc kubenswrapper[4685]: E0321 03:48:01.301177 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:48:01 crc kubenswrapper[4685]: E0321 03:48:01.301225 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:48:01 crc kubenswrapper[4685]: E0321 03:48:01.301382 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.381590 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.381657 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.381676 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.381702 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.381721 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:01Z","lastTransitionTime":"2026-03-21T03:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.484643 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.484706 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.484725 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.484750 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.484768 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:01Z","lastTransitionTime":"2026-03-21T03:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.588795 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.588890 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.588909 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.588943 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.588963 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:01Z","lastTransitionTime":"2026-03-21T03:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.691724 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.691775 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.691791 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.691816 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.691856 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:01Z","lastTransitionTime":"2026-03-21T03:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.794940 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.795019 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.795042 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.795078 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.795103 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:01Z","lastTransitionTime":"2026-03-21T03:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.898665 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.898732 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.898749 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.898779 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:01 crc kubenswrapper[4685]: I0321 03:48:01.898800 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:01Z","lastTransitionTime":"2026-03-21T03:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.001939 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.002004 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.002028 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.002123 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.002152 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:02Z","lastTransitionTime":"2026-03-21T03:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.105875 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.105939 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.105955 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.105987 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.106012 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:02Z","lastTransitionTime":"2026-03-21T03:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.209090 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.209156 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.209176 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.209210 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.209233 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:02Z","lastTransitionTime":"2026-03-21T03:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.345728 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.345777 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.345792 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.345824 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.345862 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:02Z","lastTransitionTime":"2026-03-21T03:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.448134 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.448200 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.448221 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.448245 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.448261 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:02Z","lastTransitionTime":"2026-03-21T03:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.551167 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.551228 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.551246 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.551270 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.551288 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:02Z","lastTransitionTime":"2026-03-21T03:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.658913 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.658973 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.658985 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.659007 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.659027 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:02Z","lastTransitionTime":"2026-03-21T03:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.762551 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.762651 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.762669 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.762696 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.762717 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:02Z","lastTransitionTime":"2026-03-21T03:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.865700 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.865751 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.865761 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.865782 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.865791 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:02Z","lastTransitionTime":"2026-03-21T03:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.969316 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.969375 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.969392 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.969418 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:02 crc kubenswrapper[4685]: I0321 03:48:02.969438 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:02Z","lastTransitionTime":"2026-03-21T03:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.071917 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.072003 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.072027 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.072162 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.072192 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:03Z","lastTransitionTime":"2026-03-21T03:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.174749 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.174829 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.174888 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.174920 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.174943 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:03Z","lastTransitionTime":"2026-03-21T03:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.277660 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.277724 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.277747 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.277783 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.277806 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:03Z","lastTransitionTime":"2026-03-21T03:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.300639 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.300664 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.300737 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.300809 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:48:03 crc kubenswrapper[4685]: E0321 03:48:03.301303 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:48:03 crc kubenswrapper[4685]: E0321 03:48:03.301465 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:48:03 crc kubenswrapper[4685]: E0321 03:48:03.301601 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.301724 4685 scope.go:117] "RemoveContainer" containerID="c896022c1e0726ccc5a54da9c432a0ff4082158ad76abcea60c0a1427c69134b" Mar 21 03:48:03 crc kubenswrapper[4685]: E0321 03:48:03.301739 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.381178 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.381279 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.381298 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.381359 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.381380 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:03Z","lastTransitionTime":"2026-03-21T03:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.485158 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.485197 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.485205 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.485240 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.485253 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:03Z","lastTransitionTime":"2026-03-21T03:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.587606 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.587665 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.587682 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.587708 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.587732 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:03Z","lastTransitionTime":"2026-03-21T03:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.690907 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.690975 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.690991 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.691017 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.691035 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:03Z","lastTransitionTime":"2026-03-21T03:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.793825 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.793945 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.793969 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.793997 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.794017 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:03Z","lastTransitionTime":"2026-03-21T03:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.897032 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.897323 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.897391 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.897460 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:03 crc kubenswrapper[4685]: I0321 03:48:03.897519 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:03Z","lastTransitionTime":"2026-03-21T03:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.000160 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.000242 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.000261 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.000290 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.000309 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:04Z","lastTransitionTime":"2026-03-21T03:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.103648 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.103707 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.103718 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.103737 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.103750 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:04Z","lastTransitionTime":"2026-03-21T03:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.151381 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.152899 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cf0ba4af6f1f48fcb9ccf07dec53dc3ff1835a83dbf535bc48feb68fd646e78f"} Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.154062 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.177924 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fd9d618-b4ed-4942-b915-76dc59fb834a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba587c4fe2f05966282b50ba5236b9f3d9ef6de63f72c70ae9f7a5222cb8b904\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60c96edd458d05f217a2e9f07a44bd221303d821a790382a82cff0b912d48f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8583e9b5dff82d5df52e281ba4069e9259b1c8fe3d1b8121d0e9f3f9e97d47b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0ba4af6f1f48fcb9ccf07dec53dc3ff1835a83dbf535bc48feb68fd646e78f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c896022c1e0726ccc5a54da9c432a0ff4082158ad76abcea60c0a1427c69134b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T03:47:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 03:47:19.136464 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 03:47:19.136735 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 03:47:19.138007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2287411523/tls.crt::/tmp/serving-cert-2287411523/tls.key\\\\\\\"\\\\nI0321 03:47:19.320330 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 03:47:19.322862 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 03:47:19.322879 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 03:47:19.322902 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 03:47:19.322907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 03:47:19.326922 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 03:47:19.326936 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 03:47:19.326989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.326999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.327009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 03:47:19.327019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 03:47:19.327030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 03:47:19.327039 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 03:47:19.328267 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://182ad351b6c632ea64087d4784ea919d5c21165dc5d0373fa35db7f7f1eea435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:46:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:04Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.196886 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cedbf5aa59177e67115d9dbf436805563022b6cebdd1c909d460d7997af5333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:04Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.206438 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.206470 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.206481 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.206498 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.206512 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:04Z","lastTransitionTime":"2026-03-21T03:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.214303 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:04Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.229918 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v9rdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:04Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.260151 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1017f2ba3943e2b7de1a63f4eebf7771e232b07b
86bd3208e81bd5d193b31d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1017f2ba3943e2b7de1a63f4eebf7771e232b07b86bd3208e81bd5d193b31d25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T03:47:53Z\\\",\\\"message\\\":\\\"troller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI0321 03:47:53.579415 6704 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0321 03:47:53.579424 6704 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0321 03:47:53.579430 6704 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0321 03:47:53.579414 6704 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0321 03:47:53.579473 6704 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c\\\\nI0321 03:47:53.579505 6704 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c\\\\nF0321 03:47:53.579530 6704 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cpfzk_openshift-ovn-kubernetes(08dfc393-0ddb-4bde-9b1f-2a48549f4549)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cpfzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:04Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.272896 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mlsb2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b79f01f-bf05-4f7d-b816-6ef01f21e949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://415d35423a66454f5ed06ba8facb7adbda91fae370e4d1d8a022cf7002e7b8bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tf6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mlsb2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:04Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.292774 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df1f675def99930733ba3d2e467e3c07307a00cd8a10f0c3dee668de18f92dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:04Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.308972 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.309055 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.309077 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.309109 4685 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.309133 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:04Z","lastTransitionTime":"2026-03-21T03:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.315289 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:04Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.328078 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztl6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ee4ce0e664eb23ee83b20eecbc81dd4f5ad12e50e98785be3696b58528952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sc2qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztl6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:04Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.342771 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:04Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.359960 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2521a678-ad6c-464b-bf7b-c4f6237c2822\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749ea865e46b19e2475b94b696bb2be8bc27add3f2ba2420c38b9d67b2b8ddd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc29c132abc0c440dd96bf14118de27e46aa41322e61ce3d685eb43962330cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jrr5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:04Z is after 2025-08-24T17:21:41Z" Mar 21 
03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.379797 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aab80bc7857e9a3d86f540c725f4e35d0d86f979abed1ea5c9cddb4a6263924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d981cec0491154ace10ce4f437d1101c8dd69c34129a42b7dd5af226f8b14b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:04Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.404688 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3a945a5915df9df43b15e02b30e3ffa4b0f0dd4e9283d54f80b4adb56f368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6xvsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:04Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.412330 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.412380 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:04 crc 
kubenswrapper[4685]: I0321 03:48:04.412394 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.412416 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.412432 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:04Z","lastTransitionTime":"2026-03-21T03:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.422254 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7jcm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9b1743-6b69-46d3-a429-6f83bf43317a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"n
ame\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x266z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7jcm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:04Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.435716 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea46fe2-4e41-43ab-a069-cb30fb4e732c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8af9aaf0940271c159c00b04ffb7a1dd9471caa93ad4c7bd4507461a095923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7r9cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:04Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.515013 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.515046 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.515054 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.515069 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.515081 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:04Z","lastTransitionTime":"2026-03-21T03:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.618035 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.618078 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.618089 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.618109 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.618123 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:04Z","lastTransitionTime":"2026-03-21T03:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.720459 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.720503 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.720515 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.720534 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.720547 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:04Z","lastTransitionTime":"2026-03-21T03:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.823022 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.823058 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.823067 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.823081 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.823091 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:04Z","lastTransitionTime":"2026-03-21T03:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.926289 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.926336 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.926347 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.926363 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:04 crc kubenswrapper[4685]: I0321 03:48:04.926376 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:04Z","lastTransitionTime":"2026-03-21T03:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.029353 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.029411 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.029426 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.029450 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.029467 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:05Z","lastTransitionTime":"2026-03-21T03:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.132585 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.132640 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.132657 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.132680 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.132698 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:05Z","lastTransitionTime":"2026-03-21T03:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.236311 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.236374 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.236390 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.236415 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.236433 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:05Z","lastTransitionTime":"2026-03-21T03:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.300440 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.300472 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.300518 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.300467 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:48:05 crc kubenswrapper[4685]: E0321 03:48:05.300662 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:48:05 crc kubenswrapper[4685]: E0321 03:48:05.300903 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:48:05 crc kubenswrapper[4685]: E0321 03:48:05.301054 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:48:05 crc kubenswrapper[4685]: E0321 03:48:05.301190 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.339435 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.339504 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.339530 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.339564 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.339594 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:05Z","lastTransitionTime":"2026-03-21T03:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.442882 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.442942 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.442959 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.442984 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.443006 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:05Z","lastTransitionTime":"2026-03-21T03:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.545653 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.545714 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.545730 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.545757 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.545779 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:05Z","lastTransitionTime":"2026-03-21T03:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.648755 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.648832 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.648868 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.648892 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.648909 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:05Z","lastTransitionTime":"2026-03-21T03:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.751605 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.751667 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.751684 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.751708 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.751725 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:05Z","lastTransitionTime":"2026-03-21T03:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.854614 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.854745 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.854770 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.854808 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.854869 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:05Z","lastTransitionTime":"2026-03-21T03:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.957551 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.957599 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.957608 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.957625 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:05 crc kubenswrapper[4685]: I0321 03:48:05.957636 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:05Z","lastTransitionTime":"2026-03-21T03:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.060239 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.060298 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.060314 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.060339 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.060359 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:06Z","lastTransitionTime":"2026-03-21T03:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.163208 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.163275 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.163295 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.163323 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.163343 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:06Z","lastTransitionTime":"2026-03-21T03:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.275130 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.275215 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.275237 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.275292 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.275314 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:06Z","lastTransitionTime":"2026-03-21T03:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.316114 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.378876 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.378939 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.378961 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.378995 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.379014 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:06Z","lastTransitionTime":"2026-03-21T03:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.482808 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.482906 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.482928 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.482958 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.482980 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:06Z","lastTransitionTime":"2026-03-21T03:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.586112 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.586220 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.586248 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.586330 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.586401 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:06Z","lastTransitionTime":"2026-03-21T03:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.690370 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.690423 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.690439 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.690465 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.690482 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:06Z","lastTransitionTime":"2026-03-21T03:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.792797 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.792867 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.792880 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.792898 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.792911 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:06Z","lastTransitionTime":"2026-03-21T03:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.895286 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.895354 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.895377 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.895411 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.895434 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:06Z","lastTransitionTime":"2026-03-21T03:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.997828 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.997893 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.997909 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.997932 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:06 crc kubenswrapper[4685]: I0321 03:48:06.997949 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:06Z","lastTransitionTime":"2026-03-21T03:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.100362 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.100408 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.100419 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.100450 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.100463 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:07Z","lastTransitionTime":"2026-03-21T03:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.202938 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.202977 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.202991 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.203006 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.203017 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:07Z","lastTransitionTime":"2026-03-21T03:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.300178 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.300195 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.300353 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:48:07 crc kubenswrapper[4685]: E0321 03:48:07.300494 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.300513 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:48:07 crc kubenswrapper[4685]: E0321 03:48:07.300663 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:48:07 crc kubenswrapper[4685]: E0321 03:48:07.300789 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:48:07 crc kubenswrapper[4685]: E0321 03:48:07.300901 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.304823 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.304913 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.304936 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.304965 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.304985 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:07Z","lastTransitionTime":"2026-03-21T03:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.408058 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.408117 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.408134 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.408158 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.408175 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:07Z","lastTransitionTime":"2026-03-21T03:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.511513 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.511561 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.511571 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.511591 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.511608 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:07Z","lastTransitionTime":"2026-03-21T03:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.614650 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.614782 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.614810 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.614878 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.614905 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:07Z","lastTransitionTime":"2026-03-21T03:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.717678 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.717746 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.717764 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.717792 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.717814 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:07Z","lastTransitionTime":"2026-03-21T03:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.821755 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.821832 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.821904 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.821942 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.821971 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:07Z","lastTransitionTime":"2026-03-21T03:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.924232 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.924288 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.924299 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.924319 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:07 crc kubenswrapper[4685]: I0321 03:48:07.924337 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:07Z","lastTransitionTime":"2026-03-21T03:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:08 crc kubenswrapper[4685]: I0321 03:48:08.027064 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:08 crc kubenswrapper[4685]: I0321 03:48:08.027152 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:08 crc kubenswrapper[4685]: I0321 03:48:08.027173 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:08 crc kubenswrapper[4685]: I0321 03:48:08.027198 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:08 crc kubenswrapper[4685]: I0321 03:48:08.027215 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:08Z","lastTransitionTime":"2026-03-21T03:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:48:08 crc kubenswrapper[4685]: I0321 03:48:08.130135 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:08 crc kubenswrapper[4685]: I0321 03:48:08.130188 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:08 crc kubenswrapper[4685]: I0321 03:48:08.130203 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:08 crc kubenswrapper[4685]: I0321 03:48:08.130229 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:08 crc kubenswrapper[4685]: I0321 03:48:08.130245 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:08Z","lastTransitionTime":"2026-03-21T03:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:08 crc kubenswrapper[4685]: E0321 03:48:08.231311 4685 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 21 03:48:08 crc kubenswrapper[4685]: I0321 03:48:08.314219 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cedbf5aa59177e67115d9dbf436805563022b6cebdd1c909d460d7997af5333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-21T03:48:08Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:08 crc kubenswrapper[4685]: I0321 03:48:08.333820 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fd9d618-b4ed-4942-b915-76dc59fb834a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba587c4fe2f05966282b50ba5236b9f3d9ef6de63f72c70ae9f7a5222cb8b904\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60c96edd458d05f217a2e9f07a44bd221303d821a790382a82cff0b912d48f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8583e9b5dff82d5df52e281ba4069e9259b1c8fe3d1b8121d0e9f3f9e97d47b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0ba4af6f1f48fcb9ccf07dec53dc3ff1835a83dbf535bc48feb68fd646e78f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c896022c1e0726ccc5a54da9c432a0ff4082158ad76abcea60c0a1427c69134b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T03:47:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 03:47:19.136464 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 03:47:19.136735 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 03:47:19.138007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2287411523/tls.crt::/tmp/serving-cert-2287411523/tls.key\\\\\\\"\\\\nI0321 03:47:19.320330 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 03:47:19.322862 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 03:47:19.322879 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 03:47:19.322902 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 03:47:19.322907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 03:47:19.326922 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 03:47:19.326936 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 03:47:19.326989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.326999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.327009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 03:47:19.327019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 03:47:19.327030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 03:47:19.327039 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 03:47:19.328267 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://182ad351b6c632ea64087d4784ea919d5c21165dc5d0373fa35db7f7f1eea435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:46:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:08Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:08 crc kubenswrapper[4685]: I0321 03:48:08.353543 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df1f675def99930733ba3d2e467e3c07307a00cd8a10f0c3dee668de18f92dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:08Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:08 crc kubenswrapper[4685]: I0321 03:48:08.370491 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:08Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:08 crc kubenswrapper[4685]: I0321 03:48:08.387414 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:08Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:08 crc kubenswrapper[4685]: I0321 03:48:08.402371 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v9rdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:08Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:08 crc kubenswrapper[4685]: E0321 03:48:08.405323 4685 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 21 03:48:08 crc kubenswrapper[4685]: I0321 03:48:08.436151 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34752fc66f7ecfa5c0a9a76
1bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1017f2ba3943e2b7de1a63f4eebf7771e232b07b86bd3208e81bd5d193b31d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1017f2ba3943e2b7de1a63f4eebf7771e232b07b86bd3208e81bd5d193b31d25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T03:47:53Z\\\",\\\"message\\\":\\\"troller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI0321 03:47:53.579415 6704 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0321 03:47:53.579424 6704 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0321 03:47:53.579430 6704 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0321 03:47:53.579414 6704 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0321 03:47:53.579473 6704 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c\\\\nI0321 03:47:53.579505 6704 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c\\\\nF0321 03:47:53.579530 6704 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cpfzk_openshift-ovn-kubernetes(08dfc393-0ddb-4bde-9b1f-2a48549f4549)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cpfzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:08Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:08 crc kubenswrapper[4685]: I0321 03:48:08.449708 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mlsb2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b79f01f-bf05-4f7d-b816-6ef01f21e949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://415d35423a66454f5ed06ba8facb7adbda91fae370e4d1d8a022cf7002e7b8bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tf6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mlsb2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:08Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:08 crc kubenswrapper[4685]: I0321 03:48:08.460798 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztl6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ee4ce0e664eb23ee83b20eecbc81dd4f5ad12e50e98785be3696b58528952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sc2qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztl6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:08Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:08 crc kubenswrapper[4685]: I0321 03:48:08.473577 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2521a678-ad6c-464b-bf7b-c4f6237c2822\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749ea865e46b19e2475b94b696bb2be8bc27add3f2ba2420c38b9d67b2b8ddd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc29c132abc0c440dd96bf14118de27e46aa41322e61ce3d685eb43962330cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jrr5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:08Z is after 2025-08-24T17:21:41Z" Mar 21 
03:48:08 crc kubenswrapper[4685]: I0321 03:48:08.485798 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29e0ccb5-0299-4b2a-8138-694a1bb786db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0be6102e44775f379b67814ed8f979bf81153bd68b519bc5c5e6cd2e3cb8169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://229992b297c1eb9aa6f92e56505bdf819e392fb777636759549854330bf022ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T03:46:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 03:46:10.460102 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
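
The cluster-policy-controller startup message around this point prints the leader-election tolerances it derives from its lease settings ("4 retries", "30s of clock skew", "downtime tolerance is 78s", worst non-graceful acquisition "2m43s", worst graceful "26s", continuing just below). Those numbers are mutually consistent with the customary OpenShift leader-election knobs of LeaseDuration 137s, RenewDeadline 107s, RetryPeriod 26s — an assumption here, since the log does not print the knobs themselves. A minimal sketch of one consistent reading of the arithmetic:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Assumed OpenShift leader-election defaults; NOT read from this log.
	leaseDuration := 137 * time.Second
	renewDeadline := 107 * time.Second
	retryPeriod := 26 * time.Second

	// Clock skew tolerated: the slack between holding the lease and the
	// deadline for renewing it.
	clockSkew := leaseDuration - renewDeadline // 30s, matching the log

	// Renew attempts that fit inside the renew deadline.
	retries := int(renewDeadline / retryPeriod) // 4, matching the log

	// All but the last retry may fail and the lease still renews in time,
	// so the apiserver can be unreachable for (retries-1) retry periods.
	downtimeTolerance := time.Duration(retries-1) * retryPeriod // 78s (1m18s)

	// Non-graceful takeover: wait out the full lease, then win one retry.
	worstNonGraceful := leaseDuration + retryPeriod // 2m43s

	// Graceful takeover: the old leader released the lease; one retry period.
	worstGraceful := retryPeriod // 26s

	fmt.Println(clockSkew, retries, downtimeTolerance, worstNonGraceful, worstGraceful)
}
```
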
Worst graceful lease acquisition is {26s}.\\\\nI0321 03:46:10.462103 1 observer_polling.go:159] Starting file observer\\\\nI0321 03:46:10.493091 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 03:46:10.497028 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0321 03:46:40.990647 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa74801f196e57343a98011d671414abf1f1ebf4d7b962be522f0b6cad777acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41664fe1fb74ab3c38846eaf0a2b0bf46f5b0c4a2d2b3dfadb1a5e02e5a66e81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aa64984fe04680e1c00621e521e1a2a6eb5c2c2696c88fee33cc6eaa528f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:08Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:08 crc kubenswrapper[4685]: I0321 03:48:08.501859 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
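
Every one of these failed patches carries a `$setElementOrder/conditions` directive: the strategic-merge-patch hint that treats the `conditions` list as a map keyed by `type` and pins the resulting element order, so the kubelet can update one condition without resending the whole list. A minimal sketch of applying such a patch with the apimachinery helper; the condition values are invented for illustration, not taken from the entries above:

```go
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/util/strategicpatch"
)

func main() {
	// A stored pod status with one stale condition (values invented).
	original := []byte(`{"status":{"conditions":[
		{"type":"Ready","status":"False"},
		{"type":"Initialized","status":"True"}]}}`)

	// The shape of patch the kubelet sends: $setElementOrder pins the order
	// of the conditions list (merged by its "type" key), and the entries
	// that follow overwrite the matching elements.
	patch := []byte(`{"status":{
		"$setElementOrder/conditions":[{"type":"Initialized"},{"type":"Ready"}],
		"conditions":[{"type":"Ready","status":"True"}]}}`)

	// corev1.Pod carries the patchMergeKey struct tags the merge relies on.
	merged, err := strategicpatch.StrategicMergePatch(original, patch, corev1.Pod{})
	if err != nil {
		panic(err)
	}
	fmt.Println(string(merged)) // Ready flips to True; order is Initialized, Ready
}
```
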
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:08Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:08 crc kubenswrapper[4685]: I0321 03:48:08.518206 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7jcm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9b1743-6b69-46d3-a429-6f83bf43317a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x266z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7jcm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:08Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:08 crc kubenswrapper[4685]: I0321 03:48:08.533136 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea46fe2-4e41-43ab-a069-cb30fb4e732c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8af9aaf0940271c159c00b04ffb7a1dd9471caa93ad4c7bd4507461a095923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://
682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7r9cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:08Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:08 crc kubenswrapper[4685]: I0321 03:48:08.547351 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aab80bc7857e9a3d86f540c725f4e35d0d86f979abed1ea5c9cddb4a6263924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d981cec0491154ace10ce4f437d1101c8dd69c34129a42b7dd5af226f8b14b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d77325
7453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:08Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:08 crc kubenswrapper[4685]: I0321 03:48:08.566419 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3a945a5915df9df43b15e02b30e3ffa4b0f0dd4e9283d54f80b4adb56f368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1c
e4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-re
lease\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"exitCode\\\":
0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6xvsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:08Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:09 crc kubenswrapper[4685]: I0321 03:48:09.300523 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:48:09 crc kubenswrapper[4685]: I0321 03:48:09.300639 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:48:09 crc kubenswrapper[4685]: I0321 03:48:09.300738 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:48:09 crc kubenswrapper[4685]: E0321 03:48:09.300727 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:48:09 crc kubenswrapper[4685]: I0321 03:48:09.301186 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:48:09 crc kubenswrapper[4685]: E0321 03:48:09.301311 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:48:09 crc kubenswrapper[4685]: E0321 03:48:09.301447 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:48:09 crc kubenswrapper[4685]: E0321 03:48:09.301576 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:48:09 crc kubenswrapper[4685]: I0321 03:48:09.301884 4685 scope.go:117] "RemoveContainer" containerID="1017f2ba3943e2b7de1a63f4eebf7771e232b07b86bd3208e81bd5d193b31d25" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.175984 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cpfzk_08dfc393-0ddb-4bde-9b1f-2a48549f4549/ovnkube-controller/1.log" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.178991 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" event={"ID":"08dfc393-0ddb-4bde-9b1f-2a48549f4549","Type":"ContainerStarted","Data":"db2f167cea866dc3e14a9c0f6e343206b5e7d73d4c6971ddae722a4a29479cf1"} Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.179484 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.200109 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:10Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.211558 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:10Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.222335 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v9rdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:10Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.241169 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db2f167cea866dc3e14a9c0f6e343206b5e7d73d
4c6971ddae722a4a29479cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1017f2ba3943e2b7de1a63f4eebf7771e232b07b86bd3208e81bd5d193b31d25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T03:47:53Z\\\",\\\"message\\\":\\\"troller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI0321 03:47:53.579415 6704 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0321 03:47:53.579424 6704 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0321 03:47:53.579430 6704 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0321 03:47:53.579414 6704 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0321 03:47:53.579473 6704 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c\\\\nI0321 03:47:53.579505 6704 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c\\\\nF0321 03:47:53.579530 6704 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"conta
inerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cpfzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:10Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.252167 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mlsb2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b79f01f-bf05-4f7d-b816-6ef01f21e949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://415d35423a66454f5ed06ba8facb7adbda91fae370e4d1d8a022cf7002e7b8bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tf6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mlsb2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:10Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.264250 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df1f675def99930733ba3d2e467e3c07307a00cd8a10f0c3dee668de18f92dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:10Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.278252 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztl6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ee4ce0e664eb23ee83b20eecbc81dd4f5ad12e50e98785be3696b58528952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sc2qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztl6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:10Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.290650 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29e0ccb5-0299-4b2a-8138-694a1bb786db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0be6102e44775f379b67814ed8f979bf81153bd68b519bc5c5e6cd2e3cb8169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://229992b297c1eb9aa6f92e56505bdf819e392fb777636759549854330bf022ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T03:46:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 03:46:10.460102 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0321 03:46:10.462103 1 observer_polling.go:159] Starting file observer\\\\nI0321 03:46:10.493091 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 03:46:10.497028 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0321 03:46:40.990647 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa74801f196e57343a98011d671414abf1f1ebf4d7b962be522f0b6cad777acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41664fe1fb74ab3c38846eaf0a2b0bf46f5b0c4a2d2b3dfadb1a5e02e5a66e81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aa64984fe04680e1c00621e521e1a2a6eb5c2c2696c88fee33cc6eaa528f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:10Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.306116 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:10Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.317672 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2521a678-ad6c-464b-bf7b-c4f6237c2822\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749ea865e46b19e2475b94b696bb2be8bc27add3f2ba2420c38b9d67b2b8ddd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc29c132abc0c440dd96bf14118de27e46aa41322e61ce3d685eb43962330cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jrr5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:10Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.329255 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea46fe2-4e41-43ab-a069-cb30fb4e732c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8af9aaf0940271c159c00b04ffb7a1dd9471caa93ad4c7bd4507461a095923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7r9cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:10Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.341357 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aab80bc7857e9a3d86f540c725f4e35d0d86f979abed1ea5c9cddb4a6263924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d981cec0491154ace10ce4f437d1101c8dd69c34129a42b7dd5af226f8b14b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:10Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.354684 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3a945a5915df9df43b15e02b30e3ffa4b0f0dd4e9283d54f80b4adb56f368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router
-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6xvsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:10Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.367897 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7jcm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9b1743-6b69-46d3-a429-6f83bf43317a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostro
ot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x266z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7jcm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:10Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.380969 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fd9d618-b4ed-4942-b915-76dc59fb834a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba587c4fe2f05966282b50ba5236b9f3d9ef6de63f72c70ae9f7a5222cb8b904\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60c96edd458d05f217a2e9f07a44bd221303d821a790382a82cff0b912d48f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8583e9b5dff82d5df52e281ba4069e9259b1c8fe3d1b8121d0e9f3f9e97d47b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0ba4af6f1f48fcb9ccf07dec53dc3ff1835a83dbf535bc48feb68fd646e78f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c896022c1e0726ccc5a54da9c432a0ff4082158ad76abcea60c0a1427c69134b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T03:47:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 03:47:19.136464 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 03:47:19.136735 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 03:47:19.138007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2287411523/tls.crt::/tmp/serving-cert-2287411523/tls.key\\\\\\\"\\\\nI0321 03:47:19.320330 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 03:47:19.322862 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 03:47:19.322879 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 03:47:19.322902 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 03:47:19.322907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 03:47:19.326922 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 03:47:19.326936 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 03:47:19.326989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.326999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.327009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 03:47:19.327019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 03:47:19.327030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 03:47:19.327039 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 03:47:19.328267 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://182ad351b6c632ea64087d4784ea919d5c21165dc5d0373fa35db7f7f1eea435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:46:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:10Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.393184 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cedbf5aa59177e67115d9dbf436805563022b6cebdd1c909d460d7997af5333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:10Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.473615 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.473685 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.473709 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.473741 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.473765 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:10Z","lastTransitionTime":"2026-03-21T03:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:48:10 crc kubenswrapper[4685]: E0321 03:48:10.491620 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:10Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.496339 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.496388 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.496398 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.496417 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.496428 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:10Z","lastTransitionTime":"2026-03-21T03:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:10 crc kubenswrapper[4685]: E0321 03:48:10.511730 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:10Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.516769 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.516827 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.516868 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.516893 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.516912 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:10Z","lastTransitionTime":"2026-03-21T03:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:10 crc kubenswrapper[4685]: E0321 03:48:10.531261 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:10Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.536354 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.536399 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.536412 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.536433 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.537040 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:10Z","lastTransitionTime":"2026-03-21T03:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:10 crc kubenswrapper[4685]: E0321 03:48:10.551339 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:10Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.556174 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.556212 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.556225 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.556244 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:10 crc kubenswrapper[4685]: I0321 03:48:10.556258 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:10Z","lastTransitionTime":"2026-03-21T03:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:10 crc kubenswrapper[4685]: E0321 03:48:10.574872 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:10Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:10 crc kubenswrapper[4685]: E0321 03:48:10.574989 4685 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 03:48:11 crc kubenswrapper[4685]: I0321 03:48:11.031463 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:48:11 crc kubenswrapper[4685]: E0321 03:48:11.031667 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:48:43.031621269 +0000 UTC m=+155.508690091 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:48:11 crc kubenswrapper[4685]: I0321 03:48:11.031769 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:48:11 crc kubenswrapper[4685]: E0321 03:48:11.031928 4685 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 03:48:11 crc kubenswrapper[4685]: E0321 03:48:11.032017 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 03:48:43.031996149 +0000 UTC m=+155.509064951 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 03:48:11 crc kubenswrapper[4685]: I0321 03:48:11.132566 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:48:11 crc kubenswrapper[4685]: I0321 03:48:11.132633 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:48:11 crc kubenswrapper[4685]: I0321 03:48:11.132712 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:48:11 crc kubenswrapper[4685]: E0321 03:48:11.132754 4685 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 03:48:11 crc kubenswrapper[4685]: E0321 03:48:11.132779 4685 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 03:48:11 crc kubenswrapper[4685]: I0321 03:48:11.132778 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fda9b1ff-e4a8-4d15-8f7b-2974991cd252-metrics-certs\") pod \"network-metrics-daemon-v9rdl\" (UID: \"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\") " pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:48:11 crc kubenswrapper[4685]: E0321 03:48:11.132792 4685 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 03:48:11 crc kubenswrapper[4685]: E0321 03:48:11.132955 4685 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 03:48:11 crc kubenswrapper[4685]: E0321 03:48:11.132808 4685 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 03:48:11 crc kubenswrapper[4685]: E0321 03:48:11.132865 4685 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 03:48:11 crc kubenswrapper[4685]: E0321 03:48:11.133028 4685 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 03:48:11 crc kubenswrapper[4685]: E0321 03:48:11.133039 4685 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 03:48:11 crc kubenswrapper[4685]: E0321 03:48:11.133009 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 03:48:43.132976238 +0000 UTC m=+155.610045070 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 03:48:11 crc kubenswrapper[4685]: E0321 03:48:11.133110 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fda9b1ff-e4a8-4d15-8f7b-2974991cd252-metrics-certs podName:fda9b1ff-e4a8-4d15-8f7b-2974991cd252 nodeName:}" failed. No retries permitted until 2026-03-21 03:48:43.133079711 +0000 UTC m=+155.610148553 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fda9b1ff-e4a8-4d15-8f7b-2974991cd252-metrics-certs") pod "network-metrics-daemon-v9rdl" (UID: "fda9b1ff-e4a8-4d15-8f7b-2974991cd252") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 03:48:11 crc kubenswrapper[4685]: E0321 03:48:11.133157 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 03:48:43.133138263 +0000 UTC m=+155.610207095 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 03:48:11 crc kubenswrapper[4685]: E0321 03:48:11.133195 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 03:48:43.133178624 +0000 UTC m=+155.610247456 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 03:48:11 crc kubenswrapper[4685]: I0321 03:48:11.183189 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cpfzk_08dfc393-0ddb-4bde-9b1f-2a48549f4549/ovnkube-controller/2.log" Mar 21 03:48:11 crc kubenswrapper[4685]: I0321 03:48:11.183776 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cpfzk_08dfc393-0ddb-4bde-9b1f-2a48549f4549/ovnkube-controller/1.log" Mar 21 03:48:11 crc kubenswrapper[4685]: I0321 03:48:11.186919 4685 generic.go:334] "Generic (PLEG): container finished" podID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerID="db2f167cea866dc3e14a9c0f6e343206b5e7d73d4c6971ddae722a4a29479cf1" exitCode=1 Mar 21 03:48:11 crc kubenswrapper[4685]: I0321 03:48:11.186975 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" event={"ID":"08dfc393-0ddb-4bde-9b1f-2a48549f4549","Type":"ContainerDied","Data":"db2f167cea866dc3e14a9c0f6e343206b5e7d73d4c6971ddae722a4a29479cf1"} Mar 21 03:48:11 crc kubenswrapper[4685]: I0321 03:48:11.187023 4685 scope.go:117] "RemoveContainer" containerID="1017f2ba3943e2b7de1a63f4eebf7771e232b07b86bd3208e81bd5d193b31d25" Mar 21 03:48:11 crc kubenswrapper[4685]: I0321 03:48:11.187895 4685 scope.go:117] "RemoveContainer" containerID="db2f167cea866dc3e14a9c0f6e343206b5e7d73d4c6971ddae722a4a29479cf1" Mar 21 03:48:11 crc kubenswrapper[4685]: E0321 03:48:11.188664 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-cpfzk_openshift-ovn-kubernetes(08dfc393-0ddb-4bde-9b1f-2a48549f4549)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" Mar 21 03:48:11 crc kubenswrapper[4685]: I0321 03:48:11.212557 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fd9d618-b4ed-4942-b915-76dc59fb834a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba587c4fe2f05966282b50ba5236b9f3d9ef6de63f72c70ae9f7a5222cb8b904\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60c96edd458d05f217a2e9f07a44bd221303d821a790382a82cff0b912d48f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8583e9b5dff82d5df52e281ba4069e9259b1c8fe3d1b8121d0e9f3f9e97d47b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0ba4af6f1f48fcb9ccf07dec53dc3ff1835a83dbf535bc48feb68fd646e78f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c896022c1e0726ccc5a54da9c432a0ff4082158ad76abcea60c0a1427c69134b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T03:47:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 03:47:19.136464 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 03:47:19.136735 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 03:47:19.138007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2287411523/tls.crt::/tmp/serving-cert-2287411523/tls.key\\\\\\\"\\\\nI0321 03:47:19.320330 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 03:47:19.322862 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 03:47:19.322879 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 03:47:19.322902 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 03:47:19.322907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 03:47:19.326922 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 03:47:19.326936 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 03:47:19.326989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.326999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.327009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 03:47:19.327019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 03:47:19.327030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 03:47:19.327039 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 03:47:19.328267 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://182ad351b6c632ea64087d4784ea919d5c21165dc5d0373fa35db7f7f1eea435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:46:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:11Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:11 crc kubenswrapper[4685]: I0321 03:48:11.225948 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cedbf5aa59177e67115d9dbf436805563022b6cebdd1c909d460d7997af5333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:11Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:11 crc kubenswrapper[4685]: I0321 03:48:11.240500 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v9rdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:11Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:11 crc kubenswrapper[4685]: I0321 03:48:11.262394 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db2f167cea866dc3e14a9c0f6e343206b5e7d73d
4c6971ddae722a4a29479cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1017f2ba3943e2b7de1a63f4eebf7771e232b07b86bd3208e81bd5d193b31d25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T03:47:53Z\\\",\\\"message\\\":\\\"troller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI0321 03:47:53.579415 6704 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0321 03:47:53.579424 6704 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0321 03:47:53.579430 6704 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0321 03:47:53.579414 6704 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0321 03:47:53.579473 6704 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c\\\\nI0321 03:47:53.579505 6704 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c\\\\nF0321 03:47:53.579530 6704 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2f167cea866dc3e14a9c0f6e343206b5e7d73d4c6971ddae722a4a29479cf1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"message\\\":\\\".182158 6950 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 03:48:10.182201 6950 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0321 03:48:10.182262 6950 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0321 03:48:10.182349 6950 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0321 03:48:10.182419 6950 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0321 03:48:10.182472 6950 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 03:48:10.182494 6950 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0321 03:48:10.182550 6950 factory.go:656] Stopping watch factory\\\\nI0321 03:48:10.182584 6950 handler.go:208] Removed *v1.Node event handler 7\\\\nI0321 03:48:10.182225 6950 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 03:48:10.182774 6950 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 03:48:10.182795 6950 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0321 03:48:10.182807 6950 handler.go:208] Removed *v1.NetworkPolicy event 
handler 4\\\\nI0321 03:48:10.182867 6950 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 03:48:10.182891 6950 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24
6def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cpfzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:11Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:11 crc kubenswrapper[4685]: I0321 03:48:11.276460 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mlsb2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b79f01f-bf05-4f7d-b816-6ef01f21e949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://415d35423a66454f5ed06ba8facb7adbda91fae370e4d1d8a022cf7002e7b8bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tf6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mlsb2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:11Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:11 crc kubenswrapper[4685]: I0321 03:48:11.289967 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df1f675def99930733ba3d2e467e3c07307a00cd8a10f0c3dee668de18f92dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:11Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:11 crc kubenswrapper[4685]: I0321 03:48:11.300345 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:48:11 crc kubenswrapper[4685]: I0321 03:48:11.300411 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:48:11 crc kubenswrapper[4685]: I0321 03:48:11.300451 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:48:11 crc kubenswrapper[4685]: I0321 03:48:11.300345 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:48:11 crc kubenswrapper[4685]: E0321 03:48:11.300505 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:48:11 crc kubenswrapper[4685]: E0321 03:48:11.300638 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:48:11 crc kubenswrapper[4685]: I0321 03:48:11.300663 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:11Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:11 crc kubenswrapper[4685]: E0321 03:48:11.300816 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:48:11 crc kubenswrapper[4685]: E0321 03:48:11.300920 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:48:11 crc kubenswrapper[4685]: I0321 03:48:11.312779 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:11Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:11 crc kubenswrapper[4685]: I0321 03:48:11.322958 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztl6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ee4ce0e664eb23ee83b20eecbc81dd4f5ad12e50e98785be3696b58528952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sc2qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztl6v\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:11Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:11 crc kubenswrapper[4685]: I0321 03:48:11.337083 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29e0ccb5-0299-4b2a-8138-694a1bb786db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0be6102e44775f379b67814ed8f979bf81153bd68b519bc5c5e6cd2e3cb8169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://229992b297c1eb9aa6f92e56505bdf819e392fb777636759549854330bf022ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T03:46:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 03:46:10.460102 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0321 03:46:10.462103 1 observer_polling.go:159] Starting file observer\\\\nI0321 03:46:10.493091 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 03:46:10.497028 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0321 03:46:40.990647 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa74801f196e57343a98011d671414abf1f1ebf4d7b962be522f0b6cad777acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41664fe1fb74ab3c38846eaf0a2b0bf46f5b0c4a2d2b3dfadb1a5e02e5a66e81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aa64984fe04680e1c00621e521e1a2a6eb5c2c2696c88fee33cc6eaa528f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:11Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:11 crc kubenswrapper[4685]: I0321 03:48:11.350184 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:11Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:11 crc kubenswrapper[4685]: I0321 03:48:11.362853 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2521a678-ad6c-464b-bf7b-c4f6237c2822\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749ea865e46b19e2475b94b696bb2be8bc27add3f2ba2420c38b9d67b2b8ddd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc29c132abc0c440dd96bf14118de27e46aa41322e61ce3d685eb43962330cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jrr5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:11Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:11 crc kubenswrapper[4685]: I0321 03:48:11.374764 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aab80bc7857e9a3d86f540c725f4e35d0d86f979abed1ea5c9cddb4a6263924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d981cec0491154ace10ce4f437d1101c8dd69c34129a42b7dd5af226f8b14b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:11Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:11 crc kubenswrapper[4685]: I0321 03:48:11.391757 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3a945a5915df9df43b15e02b30e3ffa4b0f0dd4e9283d54f80b4adb56f368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{
\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f6
1158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6xvsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:11Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:11 crc kubenswrapper[4685]: I0321 03:48:11.404289 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7jcm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9b1743-6b69-46d3-a429-6f83bf43317a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-
dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x266z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7jcm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:11Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:11 crc kubenswrapper[4685]: I0321 03:48:11.419938 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea46fe2-4e41-43ab-a069-cb30fb4e732c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8af9aaf0940271c159c00b04ffb7a1dd9471caa93ad4c7bd4507461a095923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\
\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7r9cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:11Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:12 crc kubenswrapper[4685]: I0321 03:48:12.192167 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cpfzk_08dfc393-0ddb-4bde-9b1f-2a48549f4549/ovnkube-controller/2.log" Mar 21 03:48:12 crc kubenswrapper[4685]: I0321 03:48:12.195928 4685 scope.go:117] "RemoveContainer" containerID="db2f167cea866dc3e14a9c0f6e343206b5e7d73d4c6971ddae722a4a29479cf1" Mar 21 03:48:12 crc kubenswrapper[4685]: E0321 03:48:12.196085 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-cpfzk_openshift-ovn-kubernetes(08dfc393-0ddb-4bde-9b1f-2a48549f4549)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" Mar 21 03:48:12 crc kubenswrapper[4685]: I0321 03:48:12.217314 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fd9d618-b4ed-4942-b915-76dc59fb834a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba587c4fe2f05966282b50ba5236b9f3d9ef6de63f72c70ae9f7a5222cb8b904\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60c96edd458d05f217a2e9f07a44bd221303d821a790382a82cff0b912d48f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8583e9b5dff82d5df52e281ba4069e9259b1c8fe3d1b8121d0e9f3f9e97d47b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0ba4af6f1f48fcb9ccf07dec53dc3ff1835a83dbf535bc48feb68fd646e78f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c896022c1e0726ccc5a54da9c432a0ff4082158ad76abcea60c0a1427c69134b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T03:47:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 03:47:19.136464 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 03:47:19.136735 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 03:47:19.138007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2287411523/tls.crt::/tmp/serving-cert-2287411523/tls.key\\\\\\\"\\\\nI0321 03:47:19.320330 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 03:47:19.322862 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 03:47:19.322879 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 03:47:19.322902 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 03:47:19.322907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 03:47:19.326922 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 03:47:19.326936 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 03:47:19.326989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.326999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.327009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 03:47:19.327019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 03:47:19.327030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 03:47:19.327039 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 03:47:19.328267 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://182ad351b6c632ea64087d4784ea919d5c21165dc5d0373fa35db7f7f1eea435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:46:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:12Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:12 crc kubenswrapper[4685]: I0321 03:48:12.233657 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cedbf5aa59177e67115d9dbf436805563022b6cebdd1c909d460d7997af5333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:12Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:12 crc kubenswrapper[4685]: I0321 03:48:12.247766 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df1f675def99930733ba3d2e467e3c07307a00cd8a10f0c3dee668de18f92dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:12Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:12 crc kubenswrapper[4685]: I0321 03:48:12.260377 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:12Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:12 crc kubenswrapper[4685]: I0321 03:48:12.272043 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:12Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:12 crc kubenswrapper[4685]: I0321 03:48:12.281689 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v9rdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:12Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:12 crc kubenswrapper[4685]: I0321 03:48:12.311131 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db2f167cea866dc3e14a9c0f6e343206b5e7d73d
4c6971ddae722a4a29479cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2f167cea866dc3e14a9c0f6e343206b5e7d73d4c6971ddae722a4a29479cf1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"message\\\":\\\".182158 6950 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 03:48:10.182201 6950 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0321 03:48:10.182262 6950 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0321 03:48:10.182349 6950 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0321 03:48:10.182419 6950 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0321 03:48:10.182472 6950 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 03:48:10.182494 6950 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0321 03:48:10.182550 6950 factory.go:656] Stopping watch factory\\\\nI0321 03:48:10.182584 6950 handler.go:208] Removed *v1.Node event handler 7\\\\nI0321 03:48:10.182225 6950 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 03:48:10.182774 6950 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 03:48:10.182795 6950 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0321 03:48:10.182807 6950 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 03:48:10.182867 6950 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 03:48:10.182891 6950 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:48:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cpfzk_openshift-ovn-kubernetes(08dfc393-0ddb-4bde-9b1f-2a48549f4549)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cpfzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:12Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:12 crc kubenswrapper[4685]: I0321 03:48:12.322330 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mlsb2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b79f01f-bf05-4f7d-b816-6ef01f21e949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://415d35423a66454f5ed06ba8facb7adbda91fae370e4d1d8a022cf7002e7b8bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tf6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mlsb2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:12Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:12 crc kubenswrapper[4685]: I0321 03:48:12.336069 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztl6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ee4ce0e664eb23ee83b20eecbc81dd4f5ad12e50e98785be3696b58528952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sc2qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztl6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:12Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:12 crc kubenswrapper[4685]: I0321 03:48:12.350577 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29e0ccb5-0299-4b2a-8138-694a1bb786db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0be6102e44775f379b67814ed8f979bf81153bd68b519bc5c5e6cd2e3cb8169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://229992b297c1eb9aa6f92e56505bdf819e392fb777636759549854330bf022ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T03:46:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 03:46:10.460102 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0321 03:46:10.462103 1 observer_polling.go:159] Starting file observer\\\\nI0321 03:46:10.493091 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 03:46:10.497028 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0321 03:46:40.990647 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa74801f196e57343a98011d671414abf1f1ebf4d7b962be522f0b6cad777acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41664fe1fb74ab3c38846eaf0a2b0bf46f5b0c4a2d2b3dfadb1a5e02e5a66e81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aa64984fe04680e1c00621e521e1a2a6eb5c2c2696c88fee33cc6eaa528f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:12Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:12 crc kubenswrapper[4685]: I0321 03:48:12.362171 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:12Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:12 crc kubenswrapper[4685]: I0321 03:48:12.375633 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2521a678-ad6c-464b-bf7b-c4f6237c2822\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749ea865e46b19e2475b94b696bb2be8bc27add3f2ba2420c38b9d67b2b8ddd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc29c132abc0c440dd96bf14118de27e46aa41322e61ce3d685eb43962330cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jrr5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:12Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:12 crc kubenswrapper[4685]: I0321 03:48:12.387530 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aab80bc7857e9a3d86f540c725f4e35d0d86f979abed1ea5c9cddb4a6263924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d981cec0491154ace10ce4f437d1101c8dd69c34129a42b7dd5af226f8b14b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:12Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:12 crc kubenswrapper[4685]: I0321 03:48:12.401258 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3a945a5915df9df43b15e02b30e3ffa4b0f0dd4e9283d54f80b4adb56f368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{
\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f6
1158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6xvsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:12Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:12 crc kubenswrapper[4685]: I0321 03:48:12.415850 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7jcm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9b1743-6b69-46d3-a429-6f83bf43317a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-
dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x266z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7jcm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:12Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:12 crc kubenswrapper[4685]: I0321 03:48:12.425830 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea46fe2-4e41-43ab-a069-cb30fb4e732c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8af9aaf0940271c159c00b04ffb7a1dd9471caa93ad4c7bd4507461a095923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\
\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7r9cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:12Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:13 crc kubenswrapper[4685]: I0321 03:48:13.117451 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:48:13 crc kubenswrapper[4685]: I0321 03:48:13.135042 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:13Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:13 crc kubenswrapper[4685]: I0321 03:48:13.146098 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v9rdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:13Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:13 crc kubenswrapper[4685]: I0321 03:48:13.162497 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db2f167cea866dc3e14a9c0f6e343206b5e7d73d
4c6971ddae722a4a29479cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2f167cea866dc3e14a9c0f6e343206b5e7d73d4c6971ddae722a4a29479cf1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"message\\\":\\\".182158 6950 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 03:48:10.182201 6950 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0321 03:48:10.182262 6950 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0321 03:48:10.182349 6950 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0321 03:48:10.182419 6950 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0321 03:48:10.182472 6950 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 03:48:10.182494 6950 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0321 03:48:10.182550 6950 factory.go:656] Stopping watch factory\\\\nI0321 03:48:10.182584 6950 handler.go:208] Removed *v1.Node event handler 7\\\\nI0321 03:48:10.182225 6950 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 03:48:10.182774 6950 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 03:48:10.182795 6950 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0321 03:48:10.182807 6950 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 03:48:10.182867 6950 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 03:48:10.182891 6950 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:48:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cpfzk_openshift-ovn-kubernetes(08dfc393-0ddb-4bde-9b1f-2a48549f4549)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cpfzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:13Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:13 crc kubenswrapper[4685]: I0321 03:48:13.172121 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mlsb2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b79f01f-bf05-4f7d-b816-6ef01f21e949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://415d35423a66454f5ed06ba8facb7adbda91fae370e4d1d8a022cf7002e7b8bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tf6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mlsb2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:13Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:13 crc kubenswrapper[4685]: I0321 03:48:13.183163 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df1f675def99930733ba3d2e467e3c07307a00cd8a10f0c3dee668de18f92dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:13Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:13 crc kubenswrapper[4685]: I0321 03:48:13.193692 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:13Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:13 crc kubenswrapper[4685]: I0321 03:48:13.205120 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztl6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ee4ce0e664eb23ee83b20eecbc81dd4f5ad12e50e98785be3696b58528952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sc2qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztl6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:13Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:13 crc kubenswrapper[4685]: I0321 03:48:13.217300 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29e0ccb5-0299-4b2a-8138-694a1bb786db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0be6102e44775f379b67814ed8f979bf81153bd68b519bc5c5e6cd2e3cb8169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://229992b297c1eb9aa6f92e56505bdf819e392fb777636759549854330bf022ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T03:46:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 03:46:10.460102 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0321 03:46:10.462103 1 observer_polling.go:159] Starting file observer\\\\nI0321 03:46:10.493091 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 03:46:10.497028 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0321 03:46:40.990647 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa74801f196e57343a98011d671414abf1f1ebf4d7b962be522f0b6cad777acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41664fe1fb74ab3c38846eaf0a2b0bf46f5b0c4a2d2b3dfadb1a5e02e5a66e81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aa64984fe04680e1c00621e521e1a2a6eb5c2c2696c88fee33cc6eaa528f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:13Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:13 crc kubenswrapper[4685]: I0321 03:48:13.227579 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:13Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:13 crc kubenswrapper[4685]: I0321 03:48:13.237506 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2521a678-ad6c-464b-bf7b-c4f6237c2822\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749ea865e46b19e2475b94b696bb2be8bc27add3f2ba2420c38b9d67b2b8ddd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc29c132abc0c440dd96bf14118de27e46aa41322e61ce3d685eb43962330cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jrr5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:13Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:13 crc kubenswrapper[4685]: I0321 03:48:13.250524 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aab80bc7857e9a3d86f540c725f4e35d0d86f979abed1ea5c9cddb4a6263924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d981cec0491154ace10ce4f437d1101c8dd69c34129a42b7dd5af226f8b14b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:13Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:13 crc kubenswrapper[4685]: I0321 03:48:13.266259 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3a945a5915df9df43b15e02b30e3ffa4b0f0dd4e9283d54f80b4adb56f368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{
\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f6
1158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6xvsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:13Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:13 crc kubenswrapper[4685]: I0321 03:48:13.283664 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7jcm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9b1743-6b69-46d3-a429-6f83bf43317a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-
dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x266z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7jcm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:13Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:13 crc kubenswrapper[4685]: I0321 03:48:13.294387 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea46fe2-4e41-43ab-a069-cb30fb4e732c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8af9aaf0940271c159c00b04ffb7a1dd9471caa93ad4c7bd4507461a095923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\
\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7r9cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:13Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:13 crc kubenswrapper[4685]: I0321 03:48:13.300719 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:48:13 crc kubenswrapper[4685]: E0321 03:48:13.301062 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:48:13 crc kubenswrapper[4685]: I0321 03:48:13.300727 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:48:13 crc kubenswrapper[4685]: E0321 03:48:13.301266 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:48:13 crc kubenswrapper[4685]: I0321 03:48:13.300727 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:48:13 crc kubenswrapper[4685]: E0321 03:48:13.301483 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:48:13 crc kubenswrapper[4685]: I0321 03:48:13.300867 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:48:13 crc kubenswrapper[4685]: E0321 03:48:13.301700 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:48:13 crc kubenswrapper[4685]: I0321 03:48:13.309143 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fd9d618-b4ed-4942-b915-76dc59fb834a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba587c4fe2f05966282b50ba5236b9f3d9ef6de63f72c70ae9f7a5222cb8b904\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60c96edd458d05f217a2e9f07a44bd221303d821a790382a82cff0b912d48f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8583e9b5dff82d5df52e281ba4069e9259b1c8fe3d1b8121d0e9f3f9e97d47b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee
1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0ba4af6f1f48fcb9ccf07dec53dc3ff1835a83dbf535bc48feb68fd646e78f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c896022c1e0726ccc5a54da9c432a0ff4082158ad76abcea60c0a1427c69134b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T03:47:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 03:47:19.136464 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 03:47:19.136735 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 03:47:19.138007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2287411523/tls.crt::/tmp/serving-cert-2287411523/tls.key\\\\\\\"\\\\nI0321 03:47:19.320330 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 03:47:19.322862 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 03:47:19.322879 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 03:47:19.322902 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 03:47:19.322907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 03:47:19.326922 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 03:47:19.326936 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 03:47:19.326989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.326999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.327009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 03:47:19.327019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 03:47:19.327030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 03:47:19.327039 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 03:47:19.328267 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://182ad351b6c632ea64087d4784ea919d5c21165dc5d0373fa35db7f7f1eea435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:46:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:13Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:13 crc kubenswrapper[4685]: I0321 03:48:13.319124 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cedbf5aa59177e67115d9dbf436805563022b6cebdd1c909d460d7997af5333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:13Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:13 crc kubenswrapper[4685]: E0321 03:48:13.406778 4685 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 03:48:15 crc kubenswrapper[4685]: I0321 03:48:15.299927 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:48:15 crc kubenswrapper[4685]: I0321 03:48:15.299985 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:48:15 crc kubenswrapper[4685]: I0321 03:48:15.300000 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:48:15 crc kubenswrapper[4685]: I0321 03:48:15.299943 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:48:15 crc kubenswrapper[4685]: E0321 03:48:15.300132 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:48:15 crc kubenswrapper[4685]: E0321 03:48:15.300302 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:48:15 crc kubenswrapper[4685]: E0321 03:48:15.300401 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:48:15 crc kubenswrapper[4685]: E0321 03:48:15.300486 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:48:17 crc kubenswrapper[4685]: I0321 03:48:17.300654 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:48:17 crc kubenswrapper[4685]: I0321 03:48:17.300702 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:48:17 crc kubenswrapper[4685]: I0321 03:48:17.300729 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:48:17 crc kubenswrapper[4685]: I0321 03:48:17.300779 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:48:17 crc kubenswrapper[4685]: E0321 03:48:17.300942 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:48:17 crc kubenswrapper[4685]: E0321 03:48:17.301157 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:48:17 crc kubenswrapper[4685]: E0321 03:48:17.301290 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:48:17 crc kubenswrapper[4685]: E0321 03:48:17.301438 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:48:18 crc kubenswrapper[4685]: I0321 03:48:18.314440 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 21 03:48:18 crc kubenswrapper[4685]: I0321 03:48:18.329013 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29e0ccb5-0299-4b2a-8138-694a1bb786db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0be6102e44775f379b67814ed8f979bf81153bd68b519bc5c5e6cd2e3cb8169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://229992b297c1eb9aa6f92e56505bdf819e392fb777636759549854330bf022ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T03:46:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 03:46:10.460102 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of 
clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 03:46:10.462103 1 observer_polling.go:159] Starting file observer\\\\nI0321 03:46:10.493091 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 03:46:10.497028 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0321 03:46:40.990647 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa74801f196e57343a98011d671414abf1f1ebf4d7b962be522f0b6cad777acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41664fe1fb74ab3c38846eaf0a2b0bf46f5b0c4a2d2b3dfadb1a5e02e5a66e81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aa64984fe04680e1c00621e521e1a2a6eb5c2c2696c88fee33cc6eaa528f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:18Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:18 crc kubenswrapper[4685]: I0321 03:48:18.342515 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:18Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:18 crc kubenswrapper[4685]: I0321 03:48:18.356148 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2521a678-ad6c-464b-bf7b-c4f6237c2822\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749ea865e46b19e2475b94b696bb2be8bc27add3f2ba2420c38b9d67b2b8ddd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc29c132abc0c440dd96bf14118de27e46aa41322e61ce3d685eb43962330cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jrr5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:18Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:18 crc kubenswrapper[4685]: I0321 03:48:18.370914 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aab80bc7857e9a3d86f540c725f4e35d0d86f979abed1ea5c9cddb4a6263924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d981cec0491154ace10ce4f437d1101c8dd69c34129a42b7dd5af226f8b14b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:18Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:18 crc kubenswrapper[4685]: I0321 03:48:18.392318 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3a945a5915df9df43b15e02b30e3ffa4b0f0dd4e9283d54f80b4adb56f368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{
\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f6
1158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6xvsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:18Z is after 2025-08-24T17:21:41Z"
Mar 21 03:48:18 crc kubenswrapper[4685]: E0321 03:48:18.409643 4685 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 21 03:48:18 crc kubenswrapper[4685]: I0321 03:48:18.411656 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7jcm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9b1743-6b69-46d3-a429-6f83bf43317a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x266z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7jcm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:18Z is after 2025-08-24T17:21:41Z"
Mar 21 03:48:18 crc kubenswrapper[4685]: I0321 03:48:18.429737 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea46fe2-4e41-43ab-a069-cb30fb4e732c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8af9aaf0940271c159c00b04ffb7a1dd9471caa93ad4c7bd4507461a095923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7r9cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:18Z is after 2025-08-24T17:21:41Z"
Mar 21 03:48:18 crc kubenswrapper[4685]: I0321 03:48:18.447765 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fd9d618-b4ed-4942-b915-76dc59fb834a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba587c4fe2f05966282b50ba5236b9f3d9ef6de63f72c70ae9f7a5222cb8b904\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60c96edd458d05f217a2e9f07a44bd221303d821a790382a82cff0b912d48f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8583e9b5dff82d5df52e281ba4069e9259b1c8fe3d1b8121d0e9f3f9e97d47b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0ba4af6f1f48fcb9ccf07dec53dc3ff1835a83dbf535bc48feb68fd646e78f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c896022c1e0726ccc5a54da9c432a0ff4082158ad76abcea60c0a1427c69134b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T03:47:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 03:47:19.136464 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 03:47:19.136735 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 03:47:19.138007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2287411523/tls.crt::/tmp/serving-cert-2287411523/tls.key\\\\\\\"\\\\nI0321 03:47:19.320330 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 03:47:19.322862 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 03:47:19.322879 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 03:47:19.322902 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 03:47:19.322907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 03:47:19.326922 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 03:47:19.326936 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 03:47:19.326989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.326999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.327009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 03:47:19.327019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 03:47:19.327030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 03:47:19.327039 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 03:47:19.328267 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://182ad351b6c632ea64087d4784ea919d5c21165dc5d0373fa35db7f7f1eea435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:46:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:18Z is after 2025-08-24T17:21:41Z"
Mar 21 03:48:18 crc kubenswrapper[4685]: I0321 03:48:18.461707 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cedbf5aa59177e67115d9dbf436805563022b6cebdd1c909d460d7997af5333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:18Z is after 2025-08-24T17:21:41Z"
Mar 21 03:48:18 crc kubenswrapper[4685]: I0321 03:48:18.477266 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:18Z is after 2025-08-24T17:21:41Z"
Mar 21 03:48:18 crc kubenswrapper[4685]: I0321 03:48:18.491780 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v9rdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:18Z is after 2025-08-24T17:21:41Z"
Mar 21 03:48:18 crc kubenswrapper[4685]: I0321 03:48:18.514696 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db2f167cea866dc3e14a9c0f6e343206b5e7d73d4c6971ddae722a4a29479cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2f167cea866dc3e14a9c0f6e343206b5e7d73d4c6971ddae722a4a29479cf1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"message\\\":\\\".182158 6950 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 03:48:10.182201 6950 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0321 03:48:10.182262 6950 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0321 03:48:10.182349 6950 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0321 03:48:10.182419 6950 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0321 03:48:10.182472 6950 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 03:48:10.182494 6950 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0321 03:48:10.182550 6950 factory.go:656] Stopping watch factory\\\\nI0321 03:48:10.182584 6950 handler.go:208] Removed *v1.Node event handler 7\\\\nI0321 03:48:10.182225 6950 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 03:48:10.182774 6950 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 03:48:10.182795 6950 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0321 03:48:10.182807 6950 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 03:48:10.182867 6950 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 03:48:10.182891 6950 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:48:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-cpfzk_openshift-ovn-kubernetes(08dfc393-0ddb-4bde-9b1f-2a48549f4549)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cpfzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:18Z is after 2025-08-24T17:21:41Z"
Mar 21 03:48:18 crc kubenswrapper[4685]: I0321 03:48:18.530139 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mlsb2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b79f01f-bf05-4f7d-b816-6ef01f21e949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://415d35423a66454f5ed06ba8facb7adbda91fae370e4d1d8a022cf7002e7b8bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tf6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mlsb2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:18Z is after 2025-08-24T17:21:41Z"
Mar 21 03:48:18 crc kubenswrapper[4685]: I0321 03:48:18.546200 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df1f675def99930733ba3d2e467e3c07307a00cd8a10f0c3dee668de18f92dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:18Z is after 2025-08-24T17:21:41Z"
Mar 21 03:48:18 crc kubenswrapper[4685]: I0321 03:48:18.562281 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:18Z is after 2025-08-24T17:21:41Z"
Mar 21 03:48:18 crc kubenswrapper[4685]: I0321 03:48:18.574254 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztl6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ee4ce0e664eb23ee83b20eecbc81dd4f5ad12e50e98785be3696b58528952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sc2qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztl6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:18Z is after 2025-08-24T17:21:41Z"
Mar 21 03:48:19 crc kubenswrapper[4685]: I0321 03:48:19.300052 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl"
Mar 21 03:48:19 crc kubenswrapper[4685]: I0321 03:48:19.300046 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 21 03:48:19 crc kubenswrapper[4685]: E0321 03:48:19.300720 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252"
Mar 21 03:48:19 crc kubenswrapper[4685]: I0321 03:48:19.300094 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 21 03:48:19 crc kubenswrapper[4685]: I0321 03:48:19.300107 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 21 03:48:19 crc kubenswrapper[4685]: E0321 03:48:19.300934 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 21 03:48:19 crc kubenswrapper[4685]: E0321 03:48:19.300997 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 21 03:48:19 crc kubenswrapper[4685]: E0321 03:48:19.301076 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 21 03:48:20 crc kubenswrapper[4685]: I0321 03:48:20.897719 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 03:48:20 crc kubenswrapper[4685]: I0321 03:48:20.897758 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 03:48:20 crc kubenswrapper[4685]: I0321 03:48:20.897768 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 03:48:20 crc kubenswrapper[4685]: I0321 03:48:20.897784 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 03:48:20 crc kubenswrapper[4685]: I0321 03:48:20.897794 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:20Z","lastTransitionTime":"2026-03-21T03:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 21 03:48:20 crc kubenswrapper[4685]: E0321 03:48:20.913433 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:20Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:20 crc kubenswrapper[4685]: I0321 03:48:20.918138 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:20 crc kubenswrapper[4685]: I0321 03:48:20.918168 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 21 03:48:20 crc kubenswrapper[4685]: I0321 03:48:20.918180 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:20 crc kubenswrapper[4685]: I0321 03:48:20.918197 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:20 crc kubenswrapper[4685]: I0321 03:48:20.918213 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:20Z","lastTransitionTime":"2026-03-21T03:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:20 crc kubenswrapper[4685]: E0321 03:48:20.931468 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:20Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:20 crc kubenswrapper[4685]: I0321 03:48:20.935439 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:20 crc kubenswrapper[4685]: I0321 03:48:20.935469 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 21 03:48:20 crc kubenswrapper[4685]: I0321 03:48:20.935478 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:20 crc kubenswrapper[4685]: I0321 03:48:20.935495 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:20 crc kubenswrapper[4685]: I0321 03:48:20.935505 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:20Z","lastTransitionTime":"2026-03-21T03:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:20 crc kubenswrapper[4685]: E0321 03:48:20.948339 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:20Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:20 crc kubenswrapper[4685]: I0321 03:48:20.953054 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:20 crc kubenswrapper[4685]: I0321 03:48:20.953110 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 21 03:48:20 crc kubenswrapper[4685]: I0321 03:48:20.953126 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:20 crc kubenswrapper[4685]: I0321 03:48:20.953149 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:20 crc kubenswrapper[4685]: I0321 03:48:20.953165 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:20Z","lastTransitionTime":"2026-03-21T03:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:20 crc kubenswrapper[4685]: E0321 03:48:20.972550 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:20Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:20 crc kubenswrapper[4685]: I0321 03:48:20.976934 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:20 crc kubenswrapper[4685]: I0321 03:48:20.976965 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 21 03:48:20 crc kubenswrapper[4685]: I0321 03:48:20.976974 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:20 crc kubenswrapper[4685]: I0321 03:48:20.976990 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:20 crc kubenswrapper[4685]: I0321 03:48:20.977001 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:20Z","lastTransitionTime":"2026-03-21T03:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:20 crc kubenswrapper[4685]: E0321 03:48:20.989034 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:20Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:20 crc kubenswrapper[4685]: E0321 03:48:20.989159 4685 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 03:48:21 crc kubenswrapper[4685]: I0321 03:48:21.300522 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:48:21 crc kubenswrapper[4685]: I0321 03:48:21.300600 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:48:21 crc kubenswrapper[4685]: I0321 03:48:21.300547 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:48:21 crc kubenswrapper[4685]: I0321 03:48:21.300530 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:48:21 crc kubenswrapper[4685]: E0321 03:48:21.300782 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:48:21 crc kubenswrapper[4685]: E0321 03:48:21.300900 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:48:21 crc kubenswrapper[4685]: E0321 03:48:21.301131 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:48:21 crc kubenswrapper[4685]: E0321 03:48:21.301254 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:48:23 crc kubenswrapper[4685]: I0321 03:48:23.300602 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:48:23 crc kubenswrapper[4685]: I0321 03:48:23.300650 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:48:23 crc kubenswrapper[4685]: I0321 03:48:23.300708 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:48:23 crc kubenswrapper[4685]: I0321 03:48:23.300647 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:48:23 crc kubenswrapper[4685]: E0321 03:48:23.300803 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:48:23 crc kubenswrapper[4685]: E0321 03:48:23.301006 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:48:23 crc kubenswrapper[4685]: E0321 03:48:23.301143 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:48:23 crc kubenswrapper[4685]: E0321 03:48:23.301245 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:48:23 crc kubenswrapper[4685]: E0321 03:48:23.411338 4685 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 03:48:24 crc kubenswrapper[4685]: I0321 03:48:24.319152 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 21 03:48:25 crc kubenswrapper[4685]: I0321 03:48:25.300618 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:48:25 crc kubenswrapper[4685]: I0321 03:48:25.300672 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:48:25 crc kubenswrapper[4685]: I0321 03:48:25.300745 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:48:25 crc kubenswrapper[4685]: I0321 03:48:25.300618 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:48:25 crc kubenswrapper[4685]: E0321 03:48:25.300792 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:48:25 crc kubenswrapper[4685]: E0321 03:48:25.300938 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:48:25 crc kubenswrapper[4685]: E0321 03:48:25.301157 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:48:25 crc kubenswrapper[4685]: E0321 03:48:25.301207 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:48:27 crc kubenswrapper[4685]: I0321 03:48:27.300035 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:48:27 crc kubenswrapper[4685]: I0321 03:48:27.300125 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:48:27 crc kubenswrapper[4685]: I0321 03:48:27.300170 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:48:27 crc kubenswrapper[4685]: I0321 03:48:27.300098 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:48:27 crc kubenswrapper[4685]: E0321 03:48:27.300329 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:48:27 crc kubenswrapper[4685]: E0321 03:48:27.300700 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:48:27 crc kubenswrapper[4685]: E0321 03:48:27.300787 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:48:27 crc kubenswrapper[4685]: E0321 03:48:27.300938 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:48:27 crc kubenswrapper[4685]: I0321 03:48:27.301157 4685 scope.go:117] "RemoveContainer" containerID="db2f167cea866dc3e14a9c0f6e343206b5e7d73d4c6971ddae722a4a29479cf1" Mar 21 03:48:27 crc kubenswrapper[4685]: E0321 03:48:27.301393 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-cpfzk_openshift-ovn-kubernetes(08dfc393-0ddb-4bde-9b1f-2a48549f4549)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" Mar 21 03:48:28 crc kubenswrapper[4685]: I0321 03:48:28.317632 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74d074e3-02ec-4391-b150-07bea56db3c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a85a4c157d29725897cd0fd271fa9844551cbb95dbc21e085684115987d8e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5af33f2ffea677fec9bcbaa1de7545651b1021c12e6ea778649d9b6b160b6b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5af33f2ffea677fec9bcbaa1de7545651b1021c12e6ea778649d9b6b160b6b8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:46:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:28Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:28 crc kubenswrapper[4685]: I0321 03:48:28.336825 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df1f675def99930733ba3d2e467e3c07307a00cd8a10f0c3dee668de18f92dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:28Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:28 crc kubenswrapper[4685]: I0321 03:48:28.357777 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:28Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:28 crc kubenswrapper[4685]: I0321 03:48:28.369369 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:28Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:28 crc kubenswrapper[4685]: I0321 03:48:28.380240 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v9rdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:28Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:28 crc kubenswrapper[4685]: I0321 03:48:28.403669 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db2f167cea866dc3e14a9c0f6e343206b5e7d73d
4c6971ddae722a4a29479cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2f167cea866dc3e14a9c0f6e343206b5e7d73d4c6971ddae722a4a29479cf1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"message\\\":\\\".182158 6950 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 03:48:10.182201 6950 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0321 03:48:10.182262 6950 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0321 03:48:10.182349 6950 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0321 03:48:10.182419 6950 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0321 03:48:10.182472 6950 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 03:48:10.182494 6950 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0321 03:48:10.182550 6950 factory.go:656] Stopping watch factory\\\\nI0321 03:48:10.182584 6950 handler.go:208] Removed *v1.Node event handler 7\\\\nI0321 03:48:10.182225 6950 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 03:48:10.182774 6950 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 03:48:10.182795 6950 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0321 03:48:10.182807 6950 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 03:48:10.182867 6950 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 03:48:10.182891 6950 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:48:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cpfzk_openshift-ovn-kubernetes(08dfc393-0ddb-4bde-9b1f-2a48549f4549)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cpfzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:28Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:28 crc kubenswrapper[4685]: E0321 03:48:28.411954 4685 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 21 03:48:28 crc kubenswrapper[4685]: I0321 03:48:28.416956 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mlsb2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b79f01f-bf05-4f7d-b816-6ef01f21e949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://415d35423a66454f5ed06ba8facb7adbda91fae370e4d1d8a022cf7002e7b8bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tf6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mlsb2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:28Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:28 crc kubenswrapper[4685]: I0321 03:48:28.435443 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"591a6428-2384-44dd-826d-f6d2cf76c794\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b6e897d496eb8cc32a4f3c51a65335a9594f50c7010a0f022f54722edfd38e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe44d24bf93a4fdf116cd039951b919f771648e1a8c53df1507569b821dc8aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://405513730b33c9e9981a2c99d2e2c5897042c1a395d9fc099ad6818a6352cb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33920b49011815c1b030d555584aee220e8fede3b4f9ff5f8a5f554d6e1d8d02\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33920b49011815c1b030d555584aee220e8fede3b4f9ff5f8a5f554d6e1d8d02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:46:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:28Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:28 crc kubenswrapper[4685]: I0321 03:48:28.444738 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztl6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ee4ce0e664eb23ee83b20eecbc81dd4f5ad12e50e98785be3696b58528952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sc2qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}]
,\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztl6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:28Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:28 crc kubenswrapper[4685]: I0321 03:48:28.458343 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:28Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:28 crc kubenswrapper[4685]: I0321 03:48:28.470417 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2521a678-ad6c-464b-bf7b-c4f6237c2822\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749ea865e46b19e2475b94b696bb2be8bc27add3f2ba2420c38b9d67b2b8ddd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc29c132abc0c440dd96bf14118de27e46aa41322e61ce3d685eb43962330cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jrr5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:28Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:28 crc kubenswrapper[4685]: I0321 03:48:28.482016 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29e0ccb5-0299-4b2a-8138-694a1bb786db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0be6102e44775f379b67814ed8f979bf81153bd68b519bc5c5e6cd2e3cb8169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://229992b297c1eb9aa6f92e56505bdf819e392fb777636759549854330bf022ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T03:46:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 03:46:10.460102 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. 
Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is 26s.\\\\nI0321 03:46:10.462103 1 observer_polling.go:159] Starting file observer\\\\nI0321 03:46:10.493091 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 03:46:10.497028 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0321 03:46:40.990647 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa74801f196e57343a98011d671414abf1f1ebf4d7b962be522f0b6cad777acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41664fe1fb74ab3c38846eaf0a2b0bf46f5b0c4a2d2b3dfadb1a5e02e5a66e81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aa64984fe04680e1c00621e521e1a2a6eb5c2c2696c88fee33cc6eaa528f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMoun
ts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:28Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:28 crc kubenswrapper[4685]: I0321 03:48:28.495780 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3a945a5915df9df43b15e02b30e3ffa4b0f0dd4e9283d54f80b4adb56f368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o
://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c
4ad275c63b6c9df81dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6xvsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:28Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:28 crc kubenswrapper[4685]: I0321 03:48:28.509479 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7jcm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9b1743-6b69-46d3-a429-6f83bf43317a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x266z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7jcm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:28Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:28 crc kubenswrapper[4685]: I0321 03:48:28.520678 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea46fe2-4e41-43ab-a069-cb30fb4e732c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8af9aaf0940271c159c00b04ffb7a1dd9471caa93ad4c7bd4507461a095923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7r9cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:28Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:28 crc kubenswrapper[4685]: I0321 03:48:28.536286 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aab80bc7857e9a3d86f540c725f4e35d0d86f979abed1ea5c9cddb4a6263924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d981cec0491154ace10ce4f437d1101c8dd69c34129a42b7dd5af226f8b14b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:28Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:28 crc kubenswrapper[4685]: I0321 03:48:28.548876 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fd9d618-b4ed-4942-b915-76dc59fb834a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba587c4fe2f05966282b50ba5236b9f3d9ef6de63f72c70ae9f7a5222cb8b904\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60c96edd458d05f217a2e9f07a44bd221303d821a790382a82cff0b912d48f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"starte
dAt\\\":\\\"2026-03-21T03:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8583e9b5dff82d5df52e281ba4069e9259b1c8fe3d1b8121d0e9f3f9e97d47b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0ba4af6f1f48fcb9ccf07dec53dc3ff1835a83dbf535bc48feb68fd646e78f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c896022c1e0726ccc5a54da9c432a0ff4082158ad76abcea60c0a1427c69134b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T03:47:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 03:47:19.136464 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 03:47:19.136735 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 03:47:19.138007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2287411523/tls.crt::/tmp/serving-cert-2287411523/tls.key\\\\\\\"\\\\nI0321 03:47:19.320330 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 03:47:19.322862 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 03:47:19.322879 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 03:47:19.322902 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 03:47:19.322907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 03:47:19.326922 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 03:47:19.326936 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 03:47:19.326989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.326999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.327009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 03:47:19.327019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 03:47:19.327030 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 03:47:19.327039 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 03:47:19.328267 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://182ad351b6c632ea64087d4784ea919d5c21165dc5d0373fa35db7f7f1eea435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:46:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:28Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:28 crc kubenswrapper[4685]: I0321 03:48:28.559309 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cedbf5aa59177e67115d9dbf436805563022b6cebdd1c909d460d7997af5333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:28Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:29 crc kubenswrapper[4685]: I0321 03:48:29.261629 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7jcm2_cd9b1743-6b69-46d3-a429-6f83bf43317a/kube-multus/0.log" Mar 21 03:48:29 crc kubenswrapper[4685]: I0321 03:48:29.261672 4685 generic.go:334] "Generic (PLEG): container finished" podID="cd9b1743-6b69-46d3-a429-6f83bf43317a" containerID="a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46" exitCode=1 Mar 21 03:48:29 crc kubenswrapper[4685]: I0321 03:48:29.261703 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7jcm2" event={"ID":"cd9b1743-6b69-46d3-a429-6f83bf43317a","Type":"ContainerDied","Data":"a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46"} Mar 21 03:48:29 crc kubenswrapper[4685]: I0321 03:48:29.262073 4685 scope.go:117] "RemoveContainer" containerID="a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46" Mar 21 03:48:29 crc kubenswrapper[4685]: I0321 03:48:29.279556 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df1f675def99930733ba3d2e467e3c07307a00cd8a10f0c3dee668de18f92dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:29Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:29 crc kubenswrapper[4685]: I0321 03:48:29.292028 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:29Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:29 crc kubenswrapper[4685]: I0321 03:48:29.300404 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:48:29 crc kubenswrapper[4685]: I0321 03:48:29.300450 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:48:29 crc kubenswrapper[4685]: I0321 03:48:29.300482 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:48:29 crc kubenswrapper[4685]: E0321 03:48:29.300544 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:48:29 crc kubenswrapper[4685]: I0321 03:48:29.300413 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:48:29 crc kubenswrapper[4685]: E0321 03:48:29.300635 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:48:29 crc kubenswrapper[4685]: E0321 03:48:29.300697 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:48:29 crc kubenswrapper[4685]: E0321 03:48:29.300746 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:48:29 crc kubenswrapper[4685]: I0321 03:48:29.304126 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:29Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:29 crc kubenswrapper[4685]: I0321 03:48:29.317450 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v9rdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:29Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:29 crc kubenswrapper[4685]: I0321 03:48:29.334461 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db2f167cea866dc3e14a9c0f6e343206b5e7d73d4c6971ddae722a4a29479cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2f167cea866dc3e14a9c0f6e343206b5e7d73d4c6971ddae722a4a29479cf1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"message\\\":\\\".182158 6950 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 03:48:10.182201 6950 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0321 03:48:10.182262 6950 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0321 03:48:10.182349 6950 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0321 03:48:10.182419 6950 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0321 03:48:10.182472 6950 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 03:48:10.182494 6950 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0321 03:48:10.182550 6950 factory.go:656] Stopping watch factory\\\\nI0321 03:48:10.182584 6950 handler.go:208] Removed *v1.Node event handler 7\\\\nI0321 03:48:10.182225 6950 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 03:48:10.182774 6950 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 03:48:10.182795 6950 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0321 03:48:10.182807 6950 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 03:48:10.182867 6950 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 03:48:10.182891 6950 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:48:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cpfzk_openshift-ovn-kubernetes(08dfc393-0ddb-4bde-9b1f-2a48549f4549)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cpfzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:29Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:29 crc kubenswrapper[4685]: I0321 03:48:29.343904 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mlsb2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b79f01f-bf05-4f7d-b816-6ef01f21e949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://415d35423a66454f5ed06ba8facb7adbda91fae370e4d1d8a022cf7002e7b8bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tf6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mlsb2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:29Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:29 crc kubenswrapper[4685]: I0321 03:48:29.357078 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"591a6428-2384-44dd-826d-f6d2cf76c794\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b6e897d496eb8cc32a4f3c51a65335a9594f50c7010a0f022f54722edfd38e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe44d24bf93a4fdf116cd039951b919f771648e1a8c53df1507569b821dc8aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://405513730b33c9e9981a2c99d2e2c5897042c1a395d9fc099ad6818a6352cb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33920b49011815c1b030d555584aee220e8fede3b4f9ff5f8a5f554d6e1d8d02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33920b49011815c1b030d555584aee220e8fede3b4f9ff5f8a5f554d6e1d8d02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:46:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:29Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:29 crc kubenswrapper[4685]: I0321 03:48:29.367133 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74d074e3-02ec-4391-b150-07bea56db3c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a85a4c157d29725897cd0fd271fa9844551cbb95dbc21e085684115987d8e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5af33f2ffea677fec9bcbaa1de7545651b1021c12e6ea778649d9b6b160b6b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5af33f2ffea677fec9bcbaa1de7545651b1021c12e6ea778649d9b6b160b6b8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:46:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:29Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:29 crc kubenswrapper[4685]: I0321 03:48:29.377429 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztl6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ee4ce0e664eb23ee83b20eecbc81dd4f5ad12e50e98785be3696b58528952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sc2qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztl6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:29Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:29 crc kubenswrapper[4685]: I0321 03:48:29.386689 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2521a678-ad6c-464b-bf7b-c4f6237c2822\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749ea865e46b19e2475b94b696bb2be8bc27add3f2ba2420c38b9d67b2b8ddd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc29c132abc0c440dd96bf14118de27e46aa41322e61ce3d685eb43962330cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jrr5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:29Z is after 2025-08-24T17:21:41Z" Mar 21 
03:48:29 crc kubenswrapper[4685]: I0321 03:48:29.399490 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29e0ccb5-0299-4b2a-8138-694a1bb786db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0be6102e44775f379b67814ed8f979bf81153bd68b519bc5c5e6cd2e3cb8169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://229992b297c1eb9aa6f92e56505bdf819e392fb777636759549854330bf022ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T03:46:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 03:46:10.460102 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0321 03:46:10.462103 1 observer_polling.go:159] Starting file observer\\\\nI0321 03:46:10.493091 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 03:46:10.497028 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0321 03:46:40.990647 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa74801f196e57343a98011d671414abf1f1ebf4d7b962be522f0b6cad777acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41664fe1fb74ab3c38846eaf0a2b0bf46f5b0c4a2d2b3dfadb1a5e02e5a66e81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aa64984fe04680e1c00621e521e1a2a6eb5c2c2696c88fee33cc6eaa528f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:29Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:29 crc kubenswrapper[4685]: I0321 03:48:29.410476 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:29Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:29 crc kubenswrapper[4685]: I0321 03:48:29.424367 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7jcm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9b1743-6b69-46d3-a429-6f83bf43317a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T03:48:28Z\\\",\\\"message\\\":\\\"2026-03-21T03:47:42+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_20b06b61-cb28-4223-baf7-e69f86682674\\\\n2026-03-21T03:47:42+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_20b06b61-cb28-4223-baf7-e69f86682674 to /host/opt/cni/bin/\\\\n2026-03-21T03:47:43Z [verbose] multus-daemon started\\\\n2026-03-21T03:47:43Z [verbose] Readiness Indicator file check\\\\n2026-03-21T03:48:28Z [error] have you checked that your 
default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x266z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7jcm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:29Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:29 crc kubenswrapper[4685]: I0321 03:48:29.433550 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea46fe2-4e41-43ab-a069-cb30fb4e732c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8af9aaf0940271c159c00b04ffb7a1dd9471caa93ad4c7bd4507461a095923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7r9cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:29Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:29 crc kubenswrapper[4685]: I0321 03:48:29.445239 4685 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aab80bc7857e9a3d86f540c725f4e35d0d86f979abed1ea5c9cddb4a6263924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d981cec0491154ace10ce4f437d1101c8dd69c34129a42b7dd5af226f8b14b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:29Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:29 crc kubenswrapper[4685]: I0321 03:48:29.459142 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3a945a5915df9df43b15e02b30e3ffa4b0f0dd4e9283d54f80b4adb56f368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6xvsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:29Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:29 crc kubenswrapper[4685]: I0321 03:48:29.468794 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cedbf5aa59177e67115d9dbf436805563022b6cebdd1c909d460d7997af5333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:29Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:29 crc kubenswrapper[4685]: I0321 03:48:29.480226 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fd9d618-b4ed-4942-b915-76dc59fb834a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba587c4fe2f05966282b50ba5236b9f3d9ef6de63f72c70ae9f7a5222cb8b904\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60c96edd458d05f217a2e9f07a44bd221303d821a790382a82cff0b912d48f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8583e9b5dff82d5df52e281ba4069e9259b1c8fe3d1b8121d0e9f3f9e97d47b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0ba4af6f1f48fcb9ccf07dec53dc3ff1835a83dbf535bc48feb68fd646e78f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c896022c1e0726ccc5a54da9c432a0ff4082158ad76abcea60c0a1427c69134b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T03:47:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 03:47:19.136464 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 03:47:19.136735 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 03:47:19.138007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2287411523/tls.crt::/tmp/serving-cert-2287411523/tls.key\\\\\\\"\\\\nI0321 03:47:19.320330 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 03:47:19.322862 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 03:47:19.322879 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 03:47:19.322902 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 03:47:19.322907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 03:47:19.326922 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 03:47:19.326936 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 03:47:19.326989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.326999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.327009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 03:47:19.327019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 03:47:19.327030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 03:47:19.327039 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 03:47:19.328267 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://182ad351b6c632ea64087d4784ea919d5c21165dc5d0373fa35db7f7f1eea435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:46:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:29Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:30 crc kubenswrapper[4685]: I0321 03:48:30.266353 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7jcm2_cd9b1743-6b69-46d3-a429-6f83bf43317a/kube-multus/0.log" Mar 21 03:48:30 crc kubenswrapper[4685]: I0321 03:48:30.266410 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7jcm2" event={"ID":"cd9b1743-6b69-46d3-a429-6f83bf43317a","Type":"ContainerStarted","Data":"ffa428e52a3c6324be6bace33035c1626678061a65e2badd389c6e93850ce25f"} Mar 21 03:48:30 crc kubenswrapper[4685]: I0321 03:48:30.279605 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aab80bc7857e9a3d86f540c725f4e35d0d86f979abed1ea5c9cddb4a6263924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d981cec0491154ace10ce4f437d1101c8dd69c34129a42b7dd5af226f8b14b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:30Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:30 crc kubenswrapper[4685]: I0321 03:48:30.295624 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3a945a5915df9df43b15e02b30e3ffa4b0f0dd4e9283d54f80b4adb56f368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6xvsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:30Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:30 crc kubenswrapper[4685]: I0321 03:48:30.314598 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7jcm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9b1743-6b69-46d3-a429-6f83bf43317a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffa428e52a3c6324be6bace33035c1626678061a65e2badd389c6e93850ce25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T03:48:28Z\\\",\\\"message\\\":\\\"2026-03-21T03:47:42+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_20b06b61-cb28-4223-baf7-e69f86682674\\\\n2026-03-21T03:47:42+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_20b06b61-cb28-4223-baf7-e69f86682674 to /host/opt/cni/bin/\\\\n2026-03-21T03:47:43Z [verbose] multus-daemon started\\\\n2026-03-21T03:47:43Z [verbose] Readiness Indicator file check\\\\n2026-03-21T03:48:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:48:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x266z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7jcm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:30Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:30 crc kubenswrapper[4685]: I0321 03:48:30.326466 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea46fe2-4e41-43ab-a069-cb30fb4e732c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8af9aaf0940271c159c00b04ffb7a1dd9471caa93ad4c7bd4507461a095923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7r9cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:30Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:30 crc kubenswrapper[4685]: I0321 03:48:30.340504 4685 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fd9d618-b4ed-4942-b915-76dc59fb834a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba587c4fe2f05966282b50ba5236b9f3d9ef6de63f72c70ae9f7a5222cb8b904\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60c96edd458d05f217a2e9f07a44bd221303d821a790382a82cff0b912d48f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8583e9b5dff82d5df52e281ba4069e9259b1c8fe3d1b8121d0e9f3f9e97d47b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0ba4af6f1f48fcb9ccf07dec53dc3ff1835a83dbf
535bc48feb68fd646e78f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c896022c1e0726ccc5a54da9c432a0ff4082158ad76abcea60c0a1427c69134b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T03:47:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 03:47:19.136464 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 03:47:19.136735 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 03:47:19.138007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2287411523/tls.crt::/tmp/serving-cert-2287411523/tls.key\\\\\\\"\\\\nI0321 03:47:19.320330 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 03:47:19.322862 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 03:47:19.322879 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 03:47:19.322902 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 03:47:19.322907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 03:47:19.326922 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 03:47:19.326936 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 03:47:19.326989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.326999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.327009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 03:47:19.327019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 03:47:19.327030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 03:47:19.327039 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 03:47:19.328267 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://182ad351b6c632ea64087d4784ea919d5c21165dc5d0373fa35db7f7f1eea435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:46:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:30Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:30 crc kubenswrapper[4685]: I0321 03:48:30.353219 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cedbf5aa59177e67115d9dbf436805563022b6cebdd1c909d460d7997af5333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:30Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:30 crc kubenswrapper[4685]: I0321 03:48:30.364177 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"591a6428-2384-44dd-826d-f6d2cf76c794\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b6e897d496eb8cc32a4f3c51a65335a9594f50c7010a0f022f54722edfd38e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe44d24bf93a4fdf116cd039951b919f771648e1a8c53df1507569b821dc8aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://405513730b33c9e9981a2c99d2e2c5897042c1a395d9fc099ad6818a6352cb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33920b49011815c1b030d555584aee220e8fede3b4f9ff5f8a5f554d6e1d8d02\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33920b49011815c1b030d555584aee220e8fede3b4f9ff5f8a5f554d6e1d8d02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:46:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:30Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:30 crc kubenswrapper[4685]: I0321 03:48:30.373528 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74d074e3-02ec-4391-b150-07bea56db3c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a85a4c157d29725897cd0fd271fa9844551cbb95dbc21e085684115987d8e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5af33f2ffea677fec9bcbaa1de7545651b1021c12e6ea778649d9b6b160b6b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5af33f2ffea677fec9bcbaa1de7545651b1021c12e6ea778649d9b6b160b6b8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:46:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:30Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:30 crc kubenswrapper[4685]: I0321 03:48:30.387205 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df1f675def99930733ba3d2e467e3c07307a00cd8a10f0c3dee668de18f92dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:30Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:30 crc kubenswrapper[4685]: I0321 03:48:30.397993 4685 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:30Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:30 crc kubenswrapper[4685]: I0321 03:48:30.412164 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:30Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:30 crc kubenswrapper[4685]: I0321 03:48:30.421317 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v9rdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:30Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:30 crc kubenswrapper[4685]: I0321 03:48:30.440197 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db2f167cea866dc3e14a9c0f6e343206b5e7d73d
4c6971ddae722a4a29479cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2f167cea866dc3e14a9c0f6e343206b5e7d73d4c6971ddae722a4a29479cf1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"message\\\":\\\".182158 6950 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 03:48:10.182201 6950 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0321 03:48:10.182262 6950 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0321 03:48:10.182349 6950 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0321 03:48:10.182419 6950 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0321 03:48:10.182472 6950 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 03:48:10.182494 6950 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0321 03:48:10.182550 6950 factory.go:656] Stopping watch factory\\\\nI0321 03:48:10.182584 6950 handler.go:208] Removed *v1.Node event handler 7\\\\nI0321 03:48:10.182225 6950 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 03:48:10.182774 6950 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 03:48:10.182795 6950 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0321 03:48:10.182807 6950 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 03:48:10.182867 6950 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 03:48:10.182891 6950 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:48:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cpfzk_openshift-ovn-kubernetes(08dfc393-0ddb-4bde-9b1f-2a48549f4549)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cpfzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:30Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:30 crc kubenswrapper[4685]: I0321 03:48:30.450965 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mlsb2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b79f01f-bf05-4f7d-b816-6ef01f21e949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://415d35423a66454f5ed06ba8facb7adbda91fae370e4d1d8a022cf7002e7b8bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tf6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mlsb2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:30Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:30 crc kubenswrapper[4685]: I0321 03:48:30.464011 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztl6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ee4ce0e664eb23ee83b20eecbc81dd4f5ad12e50e98785be3696b58528952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sc2qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztl6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:30Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:30 crc kubenswrapper[4685]: I0321 03:48:30.476673 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29e0ccb5-0299-4b2a-8138-694a1bb786db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0be6102e44775f379b67814ed8f979bf81153bd68b519bc5c5e6cd2e3cb8169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://229992b297c1eb9aa6f92e56505bdf819e392fb777636759549854330bf022ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T03:46:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 03:46:10.460102 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0321 03:46:10.462103 1 observer_polling.go:159] Starting file observer\\\\nI0321 03:46:10.493091 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 03:46:10.497028 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0321 03:46:40.990647 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa74801f196e57343a98011d671414abf1f1ebf4d7b962be522f0b6cad777acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41664fe1fb74ab3c38846eaf0a2b0bf46f5b0c4a2d2b3dfadb1a5e02e5a66e81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aa64984fe04680e1c00621e521e1a2a6eb5c2c2696c88fee33cc6eaa528f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:30Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:30 crc kubenswrapper[4685]: I0321 03:48:30.491157 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:30Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:30 crc kubenswrapper[4685]: I0321 03:48:30.504115 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2521a678-ad6c-464b-bf7b-c4f6237c2822\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749ea865e46b19e2475b94b696bb2be8bc27add3f2ba2420c38b9d67b2b8ddd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc29c132abc0c440dd96bf14118de27e46aa41322e61ce3d685eb43962330cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jrr5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:30Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:31 crc kubenswrapper[4685]: I0321 03:48:31.069116 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:31 crc kubenswrapper[4685]: I0321 03:48:31.069167 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:31 crc kubenswrapper[4685]: I0321 03:48:31.069179 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:31 crc kubenswrapper[4685]: I0321 03:48:31.069198 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:31 crc kubenswrapper[4685]: I0321 03:48:31.069209 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:31Z","lastTransitionTime":"2026-03-21T03:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:48:31 crc kubenswrapper[4685]: E0321 03:48:31.081744 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:31Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:31 crc kubenswrapper[4685]: I0321 03:48:31.086780 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:31 crc kubenswrapper[4685]: I0321 03:48:31.086897 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 21 03:48:31 crc kubenswrapper[4685]: I0321 03:48:31.086939 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:31 crc kubenswrapper[4685]: I0321 03:48:31.086959 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:31 crc kubenswrapper[4685]: I0321 03:48:31.086971 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:31Z","lastTransitionTime":"2026-03-21T03:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:31 crc kubenswrapper[4685]: E0321 03:48:31.103084 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:31Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:31 crc kubenswrapper[4685]: I0321 03:48:31.107170 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:31 crc kubenswrapper[4685]: I0321 03:48:31.107215 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 21 03:48:31 crc kubenswrapper[4685]: I0321 03:48:31.107224 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:31 crc kubenswrapper[4685]: I0321 03:48:31.107245 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:31 crc kubenswrapper[4685]: I0321 03:48:31.107256 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:31Z","lastTransitionTime":"2026-03-21T03:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:31 crc kubenswrapper[4685]: E0321 03:48:31.118758 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:31Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:31 crc kubenswrapper[4685]: I0321 03:48:31.122570 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:31 crc kubenswrapper[4685]: I0321 03:48:31.122596 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 21 03:48:31 crc kubenswrapper[4685]: I0321 03:48:31.122606 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:31 crc kubenswrapper[4685]: I0321 03:48:31.122620 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:31 crc kubenswrapper[4685]: I0321 03:48:31.122630 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:31Z","lastTransitionTime":"2026-03-21T03:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:31 crc kubenswrapper[4685]: E0321 03:48:31.140929 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:31Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:31 crc kubenswrapper[4685]: I0321 03:48:31.144802 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:31 crc kubenswrapper[4685]: I0321 03:48:31.144930 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 21 03:48:31 crc kubenswrapper[4685]: I0321 03:48:31.144956 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:31 crc kubenswrapper[4685]: I0321 03:48:31.144985 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:31 crc kubenswrapper[4685]: I0321 03:48:31.145011 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:31Z","lastTransitionTime":"2026-03-21T03:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:31 crc kubenswrapper[4685]: E0321 03:48:31.166138 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:31Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:31 crc kubenswrapper[4685]: E0321 03:48:31.166662 4685 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 03:48:31 crc kubenswrapper[4685]: I0321 03:48:31.300914 4685 util.go:30] "No sandbox for pod can be found. 
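Every "Error updating node status, will retry" entry in this window fails for the same reason: the API server cannot call the node.network-node-identity.openshift.io admission webhook because the webhook's serving certificate expired on 2025-08-24T17:21:41Z while the node clock reads 2026-03-21, so the kubelet's status patch is rejected until the retry budget is exhausted. The handshake failure is the standard x509 validity-window comparison; the following is a minimal Go sketch of that check, with the certificate path a hypothetical placeholder rather than anything taken from this log:

```go
// certcheck.go - sketch of the x509 validity-window test that fails above.
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

func main() {
	// Hypothetical path to the webhook's serving certificate.
	data, err := os.ReadFile("serving-cert.pem")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	block, _ := pem.Decode(data)
	if block == nil {
		fmt.Fprintln(os.Stderr, "no PEM block found")
		os.Exit(1)
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	// The TLS library applies this same window test; the log message
	// "current time 2026-03-21T03:48:31Z is after 2025-08-24T17:21:41Z"
	// corresponds to the first branch.
	now := time.Now()
	switch {
	case now.After(cert.NotAfter):
		fmt.Printf("certificate has expired: current time %s is after %s\n",
			now.UTC().Format(time.RFC3339), cert.NotAfter.UTC().Format(time.RFC3339))
	case now.Before(cert.NotBefore):
		fmt.Printf("certificate is not yet valid until %s\n",
			cert.NotBefore.UTC().Format(time.RFC3339))
	default:
		fmt.Println("certificate is within its validity window")
	}
}
```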
Mar 21 03:48:31 crc kubenswrapper[4685]: I0321 03:48:31.300914 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 21 03:48:31 crc kubenswrapper[4685]: I0321 03:48:31.300914 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl"
Mar 21 03:48:31 crc kubenswrapper[4685]: I0321 03:48:31.301033 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 21 03:48:31 crc kubenswrapper[4685]: E0321 03:48:31.301505 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252"
Mar 21 03:48:31 crc kubenswrapper[4685]: I0321 03:48:31.301074 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 21 03:48:31 crc kubenswrapper[4685]: E0321 03:48:31.301631 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 21 03:48:31 crc kubenswrapper[4685]: E0321 03:48:31.301714 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 21 03:48:31 crc kubenswrapper[4685]: E0321 03:48:31.301467 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 21 03:48:33 crc kubenswrapper[4685]: I0321 03:48:33.300736 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 21 03:48:33 crc kubenswrapper[4685]: I0321 03:48:33.300769 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 21 03:48:33 crc kubenswrapper[4685]: I0321 03:48:33.300917 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl"
Mar 21 03:48:33 crc kubenswrapper[4685]: E0321 03:48:33.300988 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 21 03:48:33 crc kubenswrapper[4685]: E0321 03:48:33.301157 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252"
Mar 21 03:48:33 crc kubenswrapper[4685]: E0321 03:48:33.301368 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 21 03:48:33 crc kubenswrapper[4685]: I0321 03:48:33.302108 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 21 03:48:33 crc kubenswrapper[4685]: E0321 03:48:33.302258 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 21 03:48:33 crc kubenswrapper[4685]: E0321 03:48:33.413586 4685 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 21 03:48:35 crc kubenswrapper[4685]: I0321 03:48:35.300734 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl"
Mar 21 03:48:35 crc kubenswrapper[4685]: I0321 03:48:35.300894 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 21 03:48:35 crc kubenswrapper[4685]: E0321 03:48:35.300940 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252"
Mar 21 03:48:35 crc kubenswrapper[4685]: E0321 03:48:35.301099 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 21 03:48:35 crc kubenswrapper[4685]: I0321 03:48:35.300734 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 21 03:48:35 crc kubenswrapper[4685]: E0321 03:48:35.301231 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 21 03:48:35 crc kubenswrapper[4685]: I0321 03:48:35.300734 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 21 03:48:35 crc kubenswrapper[4685]: E0321 03:48:35.301352 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 21 03:48:37 crc kubenswrapper[4685]: I0321 03:48:37.300645 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 21 03:48:37 crc kubenswrapper[4685]: I0321 03:48:37.300721 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl"
Mar 21 03:48:37 crc kubenswrapper[4685]: I0321 03:48:37.300774 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 21 03:48:37 crc kubenswrapper[4685]: I0321 03:48:37.300650 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 21 03:48:37 crc kubenswrapper[4685]: E0321 03:48:37.300922 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:48:37 crc kubenswrapper[4685]: E0321 03:48:37.301190 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:48:37 crc kubenswrapper[4685]: E0321 03:48:37.301330 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:48:38 crc kubenswrapper[4685]: I0321 03:48:38.328044 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fd9d618-b4ed-4942-b915-76dc59fb834a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba587c4fe2f05966282b50ba5236b9f3d9ef6de63f72c70ae9f7a5222cb8b904\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60c96edd458d05f217a2e9f07a44bd221303d821a790382a82cff0b912d48f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8583e9b5dff82d5df52e281ba4069e9259b1c8fe3d1b8121d0e9f3f9e97d47b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0ba4af6f1f48fcb9ccf07dec53dc3ff1835a83dbf535bc48feb68fd646e78f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c896022c1e0726ccc5a54da9c432a0ff4082158ad76abcea60c0a1427c69134b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T03:47:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 03:47:19.136464 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 03:47:19.136735 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 03:47:19.138007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2287411523/tls.crt::/tmp/serving-cert-2287411523/tls.key\\\\\\\"\\\\nI0321 03:47:19.320330 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 03:47:19.322862 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 03:47:19.322879 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 03:47:19.322902 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 03:47:19.322907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 03:47:19.326922 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 03:47:19.326936 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 03:47:19.326989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.326999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.327009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 03:47:19.327019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 03:47:19.327030 
1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 03:47:19.327039 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 03:47:19.328267 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://182ad351b6c632ea64087d4784ea919d5c21165dc5d0373fa35db7f7f1eea435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:46:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:38Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:38 crc kubenswrapper[4685]: I0321 03:48:38.344977 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cedbf5aa59177e67115d9dbf436805563022b6cebdd1c909d460d7997af5333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:38Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:38 crc kubenswrapper[4685]: I0321 03:48:38.377472 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db2f167cea866dc3e14a9c0f6e343206b5e7d73d
4c6971ddae722a4a29479cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2f167cea866dc3e14a9c0f6e343206b5e7d73d4c6971ddae722a4a29479cf1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"message\\\":\\\".182158 6950 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 03:48:10.182201 6950 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0321 03:48:10.182262 6950 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0321 03:48:10.182349 6950 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0321 03:48:10.182419 6950 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0321 03:48:10.182472 6950 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 03:48:10.182494 6950 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0321 03:48:10.182550 6950 factory.go:656] Stopping watch factory\\\\nI0321 03:48:10.182584 6950 handler.go:208] Removed *v1.Node event handler 7\\\\nI0321 03:48:10.182225 6950 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 03:48:10.182774 6950 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 03:48:10.182795 6950 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0321 03:48:10.182807 6950 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 03:48:10.182867 6950 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 03:48:10.182891 6950 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:48:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cpfzk_openshift-ovn-kubernetes(08dfc393-0ddb-4bde-9b1f-2a48549f4549)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cpfzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:38Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:38 crc kubenswrapper[4685]: I0321 03:48:38.393496 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mlsb2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b79f01f-bf05-4f7d-b816-6ef01f21e949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://415d35423a66454f5ed06ba8facb7adbda91fae370e4d1d8a022cf7002e7b8bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tf6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mlsb2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:38Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:38 crc kubenswrapper[4685]: E0321 03:48:38.414750 4685 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 03:48:38 crc kubenswrapper[4685]: I0321 03:48:38.420452 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"591a6428-2384-44dd-826d-f6d2cf76c794\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b6e897d496eb8cc32a4f3c51a65335a9594f50c7010a0f022f54722edfd38e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe44d24bf93a4fdf116cd039951b919f771648e1a8c53df1507569b821dc8aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/stat
ic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://405513730b33c9e9981a2c99d2e2c5897042c1a395d9fc099ad6818a6352cb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33920b49011815c1b030d555584aee220e8fede3b4f9ff5f8a5f554d6e1d8d02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33920b49011815c1b030d555584aee220e8fede3b4f9ff5f8a5f554d6e1d8d02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:46:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:38Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:38 crc kubenswrapper[4685]: I0321 03:48:38.435182 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74d074e3-02ec-4391-b150-07bea56db3c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a85a4c157d29725897cd0fd271fa9844551cbb95dbc21e085684115987d8e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5af33f2ffea677fec9bcbaa1de7545651b1021c12e6ea778649d9b6b160b6b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5af33f2ffea677fec9bcbaa1de7545651b1021c12e6ea778649d9b6b160b6b8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:46:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:38Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:38 crc kubenswrapper[4685]: I0321 03:48:38.454783 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df1f675def99930733ba3d2e467e3c07307a00cd8a10f0c3dee668de18f92dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:38Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:38 crc kubenswrapper[4685]: I0321 03:48:38.469207 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:38Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:38 crc kubenswrapper[4685]: I0321 03:48:38.488182 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:38Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:38 crc kubenswrapper[4685]: I0321 03:48:38.502267 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v9rdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:38Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:38 crc kubenswrapper[4685]: I0321 03:48:38.518894 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztl6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ee4ce0e664eb23ee83b20eecbc81dd4f5ad12e50e98785be3696b58528952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sc2qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztl6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:38Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:38 crc kubenswrapper[4685]: I0321 03:48:38.539413 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29e0ccb5-0299-4b2a-8138-694a1bb786db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0be6102e44775f379b67814ed8f979bf81153bd68b519bc5c5e6cd2e3cb8169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://229992b297c1eb9aa6f92e56505bdf819e392fb777636759549854330bf022ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T03:46:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 03:46:10.460102 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0321 03:46:10.462103 1 observer_polling.go:159] Starting file observer\\\\nI0321 03:46:10.493091 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 03:46:10.497028 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0321 03:46:40.990647 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa74801f196e57343a98011d671414abf1f1ebf4d7b962be522f0b6cad777acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41664fe1fb74ab3c38846eaf0a2b0bf46f5b0c4a2d2b3dfadb1a5e02e5a66e81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aa64984fe04680e1c00621e521e1a2a6eb5c2c2696c88fee33cc6eaa528f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:38Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:38 crc kubenswrapper[4685]: I0321 03:48:38.563418 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:38Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:38 crc kubenswrapper[4685]: I0321 03:48:38.581185 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2521a678-ad6c-464b-bf7b-c4f6237c2822\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749ea865e46b19e2475b94b696bb2be8bc27add3f2ba2420c38b9d67b2b8ddd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc29c132abc0c440dd96bf14118de27e46aa41322e61ce3d685eb43962330cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jrr5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:38Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:38 crc kubenswrapper[4685]: I0321 03:48:38.597389 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aab80bc7857e9a3d86f540c725f4e35d0d86f979abed1ea5c9cddb4a6263924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d981cec0491154ace10ce4f437d1101c8dd69c34129a42b7dd5af226f8b14b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:38Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:38 crc kubenswrapper[4685]: I0321 03:48:38.615569 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3a945a5915df9df43b15e02b30e3ffa4b0f0dd4e9283d54f80b4adb56f368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{
\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f6
1158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6xvsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:38Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:38 crc kubenswrapper[4685]: I0321 03:48:38.629455 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7jcm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9b1743-6b69-46d3-a429-6f83bf43317a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffa428e52a3c6324be6bace33035c1626678061a65e2badd389c6e93850ce25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T03:48:28Z\\\",\\\"message\\\":\\\"2026-03-21T03:47:42+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_20b06b61-cb28-4223-baf7-e69f86682674\\\\n2026-03-21T03:47:42+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_20b06b61-cb28-4223-baf7-e69f86682674 to /host/opt/cni/bin/\\\\n2026-03-21T03:47:43Z [verbose] multus-daemon started\\\\n2026-03-21T03:47:43Z [verbose] Readiness Indicator file check\\\\n2026-03-21T03:48:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:48:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x266z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7jcm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:38Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:38 crc kubenswrapper[4685]: I0321 03:48:38.657398 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea46fe2-4e41-43ab-a069-cb30fb4e732c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8af9aaf0940271c159c00b04ffb7a1dd9471caa93ad4c7bd4507461a095923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7r9cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:38Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:39 crc kubenswrapper[4685]: I0321 03:48:39.301127 4685 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:48:39 crc kubenswrapper[4685]: I0321 03:48:39.301173 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:48:39 crc kubenswrapper[4685]: I0321 03:48:39.301267 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:48:39 crc kubenswrapper[4685]: I0321 03:48:39.301127 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:48:39 crc kubenswrapper[4685]: E0321 03:48:39.301374 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:48:39 crc kubenswrapper[4685]: E0321 03:48:39.301624 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:48:39 crc kubenswrapper[4685]: E0321 03:48:39.301803 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:48:39 crc kubenswrapper[4685]: E0321 03:48:39.301930 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:48:41 crc kubenswrapper[4685]: I0321 03:48:41.301164 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:48:41 crc kubenswrapper[4685]: I0321 03:48:41.301230 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:48:41 crc kubenswrapper[4685]: I0321 03:48:41.301187 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:48:41 crc kubenswrapper[4685]: E0321 03:48:41.301404 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:48:41 crc kubenswrapper[4685]: I0321 03:48:41.302022 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:48:41 crc kubenswrapper[4685]: E0321 03:48:41.302149 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:48:41 crc kubenswrapper[4685]: E0321 03:48:41.302175 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:48:41 crc kubenswrapper[4685]: E0321 03:48:41.302253 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:48:41 crc kubenswrapper[4685]: I0321 03:48:41.302597 4685 scope.go:117] "RemoveContainer" containerID="db2f167cea866dc3e14a9c0f6e343206b5e7d73d4c6971ddae722a4a29479cf1" Mar 21 03:48:41 crc kubenswrapper[4685]: I0321 03:48:41.364131 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:41 crc kubenswrapper[4685]: I0321 03:48:41.364888 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:41 crc kubenswrapper[4685]: I0321 03:48:41.364926 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:41 crc kubenswrapper[4685]: I0321 03:48:41.364962 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:41 crc kubenswrapper[4685]: I0321 03:48:41.364991 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:41Z","lastTransitionTime":"2026-03-21T03:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 03:48:41 crc kubenswrapper[4685]: E0321 03:48:41.394646 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:41Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:41 crc kubenswrapper[4685]: I0321 03:48:41.401445 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:41 crc kubenswrapper[4685]: I0321 03:48:41.401502 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 21 03:48:41 crc kubenswrapper[4685]: I0321 03:48:41.401519 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:41 crc kubenswrapper[4685]: I0321 03:48:41.401546 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:41 crc kubenswrapper[4685]: I0321 03:48:41.401566 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:41Z","lastTransitionTime":"2026-03-21T03:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:41 crc kubenswrapper[4685]: E0321 03:48:41.425508 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:41Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:41 crc kubenswrapper[4685]: I0321 03:48:41.432243 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:41 crc kubenswrapper[4685]: I0321 03:48:41.432430 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 21 03:48:41 crc kubenswrapper[4685]: I0321 03:48:41.432478 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:41 crc kubenswrapper[4685]: I0321 03:48:41.432645 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:41 crc kubenswrapper[4685]: I0321 03:48:41.432677 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:41Z","lastTransitionTime":"2026-03-21T03:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:41 crc kubenswrapper[4685]: E0321 03:48:41.455367 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:41Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:41 crc kubenswrapper[4685]: I0321 03:48:41.461621 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:41 crc kubenswrapper[4685]: I0321 03:48:41.461666 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
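Every status-patch retry above fails the same way: the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a serving certificate whose validity ended 2025-08-24T17:21:41Z, while the node clock reads 2026-03-21, so the kubelet's TLS client correctly rejects the handshake. A minimal Go sketch for inspecting what the endpoint actually serves (assumptions: it runs on the node itself, and InsecureSkipVerify is set only so the handshake completes far enough to read the expired certificate):

    package main

    import (
        "crypto/tls"
        "fmt"
        "log"
        "time"
    )

    func main() {
        // Skip chain verification so the connection succeeds even though the
        // serving certificate is expired; we only want to inspect it.
        conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
        if err != nil {
            log.Fatal(err)
        }
        defer conn.Close()

        cert := conn.ConnectionState().PeerCertificates[0]
        fmt.Println("subject:  ", cert.Subject)
        fmt.Println("notBefore:", cert.NotBefore.Format(time.RFC3339))
        fmt.Println("notAfter: ", cert.NotAfter.Format(time.RFC3339))
        fmt.Println("expired:  ", time.Now().After(cert.NotAfter))
    }

Run against this node, notAfter would be expected to print 2025-08-24T17:21:41Z, matching the x509 error in the entries above.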
event="NodeHasNoDiskPressure" Mar 21 03:48:41 crc kubenswrapper[4685]: I0321 03:48:41.461678 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:41 crc kubenswrapper[4685]: I0321 03:48:41.461697 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:41 crc kubenswrapper[4685]: I0321 03:48:41.461714 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:41Z","lastTransitionTime":"2026-03-21T03:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:41 crc kubenswrapper[4685]: E0321 03:48:41.482480 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:41Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:41 crc kubenswrapper[4685]: I0321 03:48:41.487219 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:41 crc kubenswrapper[4685]: I0321 03:48:41.487275 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 21 03:48:41 crc kubenswrapper[4685]: I0321 03:48:41.487299 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:41 crc kubenswrapper[4685]: I0321 03:48:41.487333 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:41 crc kubenswrapper[4685]: I0321 03:48:41.487358 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:41Z","lastTransitionTime":"2026-03-21T03:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:41 crc kubenswrapper[4685]: E0321 03:48:41.506527 4685 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8bd8455e-cd7f-4a01-9ab2-39696fd22c82\\\",\\\"systemUUID\\\":\\\"9e08e2bc-a5ba-4ecf-a5ec-5e30e6b4a1fa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:41Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:41 crc kubenswrapper[4685]: E0321 03:48:41.506880 4685 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 03:48:42 crc kubenswrapper[4685]: I0321 03:48:42.322622 4685 log.go:25] "Finished parsing log file" 
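Two separate failure modes meet at this point in the log. First, the kubelet abandons the node-status update after a bounded number of attempts ("update node status exceeds retry count"; in upstream kubelet this bound is the nodeStatusUpdateRetry constant, historically 5, which matches the run of failed patches above). Second, the Ready condition stays False because no CNI configuration file exists yet in /etc/kubernetes/cni/net.d/. A rough Go sketch of that kind of readiness check (illustrative only, not the actual kubelet/CRI-O implementation):

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    // cniConfigPresent reports whether dir contains at least one CNI network
    // configuration file (.conf, .conflist, or .json), the thing the
    // NetworkReady=false condition above is waiting for.
    func cniConfigPresent(dir string) (bool, error) {
        entries, err := os.ReadDir(dir)
        if err != nil {
            return false, err
        }
        for _, e := range entries {
            if e.IsDir() {
                continue
            }
            switch filepath.Ext(e.Name()) {
            case ".conf", ".conflist", ".json":
                return true, nil
            }
        }
        return false, nil
    }

    func main() {
        ok, err := cniConfigPresent("/etc/kubernetes/cni/net.d")
        fmt.Printf("CNI config present: %v (err: %v)\n", ok, err)
    }

On this node the check would stay false until the network provider writes its config, which is also what the multus readiness-indicator message further below is polling for.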
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cpfzk_08dfc393-0ddb-4bde-9b1f-2a48549f4549/ovnkube-controller/2.log" Mar 21 03:48:42 crc kubenswrapper[4685]: I0321 03:48:42.327353 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" event={"ID":"08dfc393-0ddb-4bde-9b1f-2a48549f4549","Type":"ContainerStarted","Data":"f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a"} Mar 21 03:48:42 crc kubenswrapper[4685]: I0321 03:48:42.327829 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:48:42 crc kubenswrapper[4685]: I0321 03:48:42.354005 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aab80bc7857e9a3d86f540c725f4e35d0d86f979abed1ea5c9cddb4a6263924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d981cec0491154ace10ce4f437d1101c8dd69c34129a42b7dd5af226f8b14b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:42Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:42 crc kubenswrapper[4685]: I0321 03:48:42.377563 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3a945a5915df9df43b15e02b30e3ffa4b0f0dd4e9283d54f80b4adb56f368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"co
ntainerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6xvsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:42Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:42 crc kubenswrapper[4685]: I0321 03:48:42.398015 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7jcm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9b1743-6b69-46d3-a429-6f83bf43317a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffa428e52a3c6324be6bace33035c1626678061a65e2badd389c6e93850ce25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T03:48:28Z\\\",\\\"message\\\":\\\"2026-03-21T03:47:42+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_20b06b61-cb28-4223-baf7-e69f86682674\\\\n2026-03-21T03:47:42+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_20b06b61-cb28-4223-baf7-e69f86682674 to /host/opt/cni/bin/\\\\n2026-03-21T03:47:43Z [verbose] multus-daemon started\\\\n2026-03-21T03:47:43Z [verbose] Readiness Indicator file check\\\\n2026-03-21T03:48:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:48:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x266z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7jcm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:42Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:42 crc kubenswrapper[4685]: I0321 03:48:42.421248 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea46fe2-4e41-43ab-a069-cb30fb4e732c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8af9aaf0940271c159c00b04ffb7a1dd9471caa93ad4c7bd4507461a095923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7r9cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:42Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:42 crc kubenswrapper[4685]: I0321 03:48:42.447353 4685 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fd9d618-b4ed-4942-b915-76dc59fb834a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba587c4fe2f05966282b50ba5236b9f3d9ef6de63f72c70ae9f7a5222cb8b904\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60c96edd458d05f217a2e9f07a44bd221303d821a790382a82cff0b912d48f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8583e9b5dff82d5df52e281ba4069e9259b1c8fe3d1b8121d0e9f3f9e97d47b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0ba4af6f1f48fcb9ccf07dec53dc3ff1835a83dbf
535bc48feb68fd646e78f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c896022c1e0726ccc5a54da9c432a0ff4082158ad76abcea60c0a1427c69134b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T03:47:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 03:47:19.136464 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 03:47:19.136735 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 03:47:19.138007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2287411523/tls.crt::/tmp/serving-cert-2287411523/tls.key\\\\\\\"\\\\nI0321 03:47:19.320330 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 03:47:19.322862 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 03:47:19.322879 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 03:47:19.322902 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 03:47:19.322907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 03:47:19.326922 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 03:47:19.326936 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 03:47:19.326989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.326999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.327009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 03:47:19.327019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 03:47:19.327030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 03:47:19.327039 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 03:47:19.328267 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://182ad351b6c632ea64087d4784ea919d5c21165dc5d0373fa35db7f7f1eea435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:46:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:42Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:42 crc kubenswrapper[4685]: I0321 03:48:42.471320 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cedbf5aa59177e67115d9dbf436805563022b6cebdd1c909d460d7997af5333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:42Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:42 crc kubenswrapper[4685]: I0321 03:48:42.491034 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v9rdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:42Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:42 crc kubenswrapper[4685]: I0321 03:48:42.522942 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5056ea79326435f55aafe0fc9580a47c394a0
d04559cd50b5eb720b2f599a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2f167cea866dc3e14a9c0f6e343206b5e7d73d4c6971ddae722a4a29479cf1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"message\\\":\\\".182158 6950 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 03:48:10.182201 6950 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0321 03:48:10.182262 6950 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0321 03:48:10.182349 6950 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0321 03:48:10.182419 6950 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0321 03:48:10.182472 6950 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 03:48:10.182494 6950 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0321 03:48:10.182550 6950 factory.go:656] Stopping watch factory\\\\nI0321 03:48:10.182584 6950 handler.go:208] Removed *v1.Node event handler 7\\\\nI0321 03:48:10.182225 6950 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 03:48:10.182774 6950 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 03:48:10.182795 6950 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0321 03:48:10.182807 6950 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 03:48:10.182867 6950 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 03:48:10.182891 6950 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:48:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cpfzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:42Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:42 crc kubenswrapper[4685]: I0321 03:48:42.538778 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mlsb2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b79f01f-bf05-4f7d-b816-6ef01f21e949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://415d35423a66454f5ed06ba8facb7adbda91fae370e4d1d8a022cf7002e7b8bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tf6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mlsb2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:42Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:42 crc kubenswrapper[4685]: I0321 03:48:42.559078 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"591a6428-2384-44dd-826d-f6d2cf76c794\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b6e897d496eb8cc32a4f3c51a65335a9594f50c7010a0f022f54722edfd38e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe44d24bf93a4fdf116cd039951b919f771648e1a8c53df1507569b821dc8aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://405513730b33c9e9981a2c99d2e2c5897042c1a395d9fc099ad6818a6352cb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33920b49011815c1b030d555584aee220e8fede3b4f9ff5f8a5f554d6e1d8d02\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33920b49011815c1b030d555584aee220e8fede3b4f9ff5f8a5f554d6e1d8d02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:46:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:42Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:42 crc kubenswrapper[4685]: I0321 03:48:42.582902 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74d074e3-02ec-4391-b150-07bea56db3c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a85a4c157d29725897cd0fd271fa9844551cbb95dbc21e085684115987d8e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5af33f2ffea677fec9bcbaa1de7545651b1021c12e6ea778649d9b6b160b6b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5af33f2ffea677fec9bcbaa1de7545651b1021c12e6ea778649d9b6b160b6b8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:46:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:42Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:42 crc kubenswrapper[4685]: I0321 03:48:42.603981 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df1f675def99930733ba3d2e467e3c07307a00cd8a10f0c3dee668de18f92dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:42Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:42 crc kubenswrapper[4685]: I0321 03:48:42.627472 4685 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:42Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:42 crc kubenswrapper[4685]: I0321 03:48:42.647215 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:42Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:42 crc kubenswrapper[4685]: I0321 03:48:42.660676 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztl6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ee4ce0e664eb23ee83b20eecbc81dd4f5ad12e50e98785be3696b58528952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-sc2qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztl6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:42Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:42 crc kubenswrapper[4685]: I0321 03:48:42.681305 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29e0ccb5-0299-4b2a-8138-694a1bb786db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0be6102e44775f379b67814ed8f979bf81153bd68b519bc5c5e6cd2e3cb8169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://229992b297c1eb9aa6f92e56505bdf819e392fb777636759549854330bf022ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T03:46:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 03:46:10.460102 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0321 03:46:10.462103 1 observer_polling.go:159] Starting file observer\\\\nI0321 03:46:10.493091 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 03:46:10.497028 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0321 03:46:40.990647 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa74801f196e57343a98011d671414abf1f1ebf4d7b962be522f0b6cad777acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41664fe1fb74ab3c38846eaf0a2b0bf46f5b0c4a2d2b3dfadb1a5e02e5a66e81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aa64984fe04680e1c00621e521e1a2a6eb5c2c2696c88fee33cc6eaa528f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:42Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:42 crc kubenswrapper[4685]: I0321 03:48:42.700737 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:42Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:42 crc kubenswrapper[4685]: I0321 03:48:42.719890 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2521a678-ad6c-464b-bf7b-c4f6237c2822\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749ea865e46b19e2475b94b696bb2be8bc27add3f2ba2420c38b9d67b2b8ddd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc29c132abc0c440dd96bf14118de27e46aa41322e61ce3d685eb43962330cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jrr5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:42Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:43 crc kubenswrapper[4685]: I0321 03:48:43.109659 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:48:43 crc kubenswrapper[4685]: I0321 03:48:43.109872 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:48:43 crc kubenswrapper[4685]: E0321 03:48:43.109984 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:47.109941228 +0000 UTC m=+219.587010060 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:48:43 crc kubenswrapper[4685]: E0321 03:48:43.110045 4685 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 03:48:43 crc kubenswrapper[4685]: E0321 03:48:43.110142 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 03:49:47.110121424 +0000 UTC m=+219.587190246 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 03:48:43 crc kubenswrapper[4685]: I0321 03:48:43.211551 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:48:43 crc kubenswrapper[4685]: I0321 03:48:43.211626 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:48:43 crc kubenswrapper[4685]: I0321 03:48:43.211668 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:48:43 crc kubenswrapper[4685]: I0321 03:48:43.211711 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fda9b1ff-e4a8-4d15-8f7b-2974991cd252-metrics-certs\") pod \"network-metrics-daemon-v9rdl\" (UID: \"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\") " pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:48:43 crc kubenswrapper[4685]: E0321 03:48:43.211792 4685 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 03:48:43 crc kubenswrapper[4685]: E0321 03:48:43.211879 4685 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 03:48:43 crc kubenswrapper[4685]: E0321 03:48:43.211923 4685 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 03:48:43 crc kubenswrapper[4685]: E0321 03:48:43.211943 4685 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 03:48:43 crc kubenswrapper[4685]: E0321 03:48:43.211959 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 03:49:47.211924189 +0000 UTC m=+219.688993011 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 03:48:43 crc kubenswrapper[4685]: E0321 03:48:43.211972 4685 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 03:48:43 crc kubenswrapper[4685]: E0321 03:48:43.212045 4685 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 03:48:43 crc kubenswrapper[4685]: E0321 03:48:43.212111 4685 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 03:48:43 crc kubenswrapper[4685]: E0321 03:48:43.212138 4685 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 03:48:43 crc kubenswrapper[4685]: E0321 03:48:43.212005 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fda9b1ff-e4a8-4d15-8f7b-2974991cd252-metrics-certs podName:fda9b1ff-e4a8-4d15-8f7b-2974991cd252 nodeName:}" failed. No retries permitted until 2026-03-21 03:49:47.211979061 +0000 UTC m=+219.689047883 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fda9b1ff-e4a8-4d15-8f7b-2974991cd252-metrics-certs") pod "network-metrics-daemon-v9rdl" (UID: "fda9b1ff-e4a8-4d15-8f7b-2974991cd252") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 03:48:43 crc kubenswrapper[4685]: E0321 03:48:43.212288 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 03:49:47.2122538 +0000 UTC m=+219.689322632 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 03:48:43 crc kubenswrapper[4685]: E0321 03:48:43.212321 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 03:49:47.212308181 +0000 UTC m=+219.689377013 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 03:48:43 crc kubenswrapper[4685]: I0321 03:48:43.300095 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:48:43 crc kubenswrapper[4685]: I0321 03:48:43.300196 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:48:43 crc kubenswrapper[4685]: I0321 03:48:43.300257 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:48:43 crc kubenswrapper[4685]: I0321 03:48:43.300275 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:48:43 crc kubenswrapper[4685]: E0321 03:48:43.300490 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:48:43 crc kubenswrapper[4685]: E0321 03:48:43.300598 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:48:43 crc kubenswrapper[4685]: E0321 03:48:43.300809 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:48:43 crc kubenswrapper[4685]: E0321 03:48:43.301109 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:48:43 crc kubenswrapper[4685]: I0321 03:48:43.335674 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cpfzk_08dfc393-0ddb-4bde-9b1f-2a48549f4549/ovnkube-controller/3.log" Mar 21 03:48:43 crc kubenswrapper[4685]: I0321 03:48:43.337807 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cpfzk_08dfc393-0ddb-4bde-9b1f-2a48549f4549/ovnkube-controller/2.log" Mar 21 03:48:43 crc kubenswrapper[4685]: I0321 03:48:43.343494 4685 generic.go:334] "Generic (PLEG): container finished" podID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerID="f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a" exitCode=1 Mar 21 03:48:43 crc kubenswrapper[4685]: I0321 03:48:43.343586 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" event={"ID":"08dfc393-0ddb-4bde-9b1f-2a48549f4549","Type":"ContainerDied","Data":"f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a"} Mar 21 03:48:43 crc kubenswrapper[4685]: I0321 03:48:43.343669 4685 scope.go:117] "RemoveContainer" containerID="db2f167cea866dc3e14a9c0f6e343206b5e7d73d4c6971ddae722a4a29479cf1" Mar 21 03:48:43 crc kubenswrapper[4685]: I0321 03:48:43.347880 4685 scope.go:117] "RemoveContainer" containerID="f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a" Mar 21 03:48:43 crc kubenswrapper[4685]: E0321 03:48:43.348323 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-cpfzk_openshift-ovn-kubernetes(08dfc393-0ddb-4bde-9b1f-2a48549f4549)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" Mar 21 03:48:43 crc kubenswrapper[4685]: I0321 03:48:43.371172 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7jcm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9b1743-6b69-46d3-a429-6f83bf43317a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffa428e52a3c6324be6bace33035c1626678061a65e2badd389c6e93850ce25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T03:48:28Z\\\",\\\"message\\\":\\\"2026-03-21T03:47:42+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_20b06b61-cb28-4223-baf7-e69f86682674\\\\n2026-03-21T03:47:42+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_20b06b61-cb28-4223-baf7-e69f86682674 to /host/opt/cni/bin/\\\\n2026-03-21T03:47:43Z [verbose] multus-daemon started\\\\n2026-03-21T03:47:43Z [verbose] Readiness Indicator file check\\\\n2026-03-21T03:48:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:48:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x266z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7jcm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:43 crc kubenswrapper[4685]: I0321 03:48:43.393181 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea46fe2-4e41-43ab-a069-cb30fb4e732c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8af9aaf0940271c159c00b04ffb7a1dd9471caa93ad4c7bd4507461a095923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7r9cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:43 crc kubenswrapper[4685]: I0321 03:48:43.415188 4685 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aab80bc7857e9a3d86f540c725f4e35d0d86f979abed1ea5c9cddb4a6263924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d981cec0491154ace10ce4f437d1101c8dd69c34129a42b7dd5af226f8b14b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:43 crc kubenswrapper[4685]: E0321 03:48:43.417831 4685 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 21 03:48:43 crc kubenswrapper[4685]: I0321 03:48:43.442813 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3a945a5915df9df43b15e02b30e3ffa4b0f0dd4e9283d54f80b4adb56f368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"image\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6xvsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:43 crc kubenswrapper[4685]: I0321 03:48:43.463648 4685 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cedbf5aa59177e67115d9dbf436805563022b6cebdd1c909d460d7997af5333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:43 crc kubenswrapper[4685]: I0321 03:48:43.485946 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fd9d618-b4ed-4942-b915-76dc59fb834a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba587c4fe2f05966282b50ba5236b9f3d9ef6de63f72c70ae9f7a5222cb8b904\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60c96edd458d05f217a2e9f07a44bd221303d821a790382a82cff0b912d48f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8583e9b5dff82d5df52e281ba4069e9259b1c8fe3d1b8121d0e9f3f9e97d47b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0ba4af6f1f48fcb9ccf07dec53dc3ff1835a83dbf535bc48feb68fd646e78f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c896022c1e0726ccc5a54da9c432a0ff4082158ad76abcea60c0a1427c69134b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T03:47:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 03:47:19.136464 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 03:47:19.136735 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 03:47:19.138007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2287411523/tls.crt::/tmp/serving-cert-2287411523/tls.key\\\\\\\"\\\\nI0321 03:47:19.320330 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 03:47:19.322862 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 03:47:19.322879 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 03:47:19.322902 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 03:47:19.322907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 03:47:19.326922 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 03:47:19.326936 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 03:47:19.326989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.326999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.327009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 03:47:19.327019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 03:47:19.327030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 03:47:19.327039 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 03:47:19.328267 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://182ad351b6c632ea64087d4784ea919d5c21165dc5d0373fa35db7f7f1eea435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:46:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:43 crc kubenswrapper[4685]: I0321 03:48:43.508974 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df1f675def99930733ba3d2e467e3c07307a00cd8a10f0c3dee668de18f92dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:43 crc kubenswrapper[4685]: I0321 03:48:43.530417 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:43 crc kubenswrapper[4685]: I0321 03:48:43.555228 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:43 crc kubenswrapper[4685]: I0321 03:48:43.572927 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v9rdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:43 crc kubenswrapper[4685]: I0321 03:48:43.606969 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5056ea79326435f55aafe0fc9580a47c394a0
d04559cd50b5eb720b2f599a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2f167cea866dc3e14a9c0f6e343206b5e7d73d4c6971ddae722a4a29479cf1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T03:48:10Z\\\",\\\"message\\\":\\\".182158 6950 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 03:48:10.182201 6950 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0321 03:48:10.182262 6950 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0321 03:48:10.182349 6950 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0321 03:48:10.182419 6950 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0321 03:48:10.182472 6950 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 03:48:10.182494 6950 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0321 03:48:10.182550 6950 factory.go:656] Stopping watch factory\\\\nI0321 03:48:10.182584 6950 handler.go:208] Removed *v1.Node event handler 7\\\\nI0321 03:48:10.182225 6950 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 03:48:10.182774 6950 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 03:48:10.182795 6950 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0321 03:48:10.182807 6950 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 03:48:10.182867 6950 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 03:48:10.182891 6950 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:48:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T03:48:42Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0321 03:48:42.325409 7338 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0321 03:48:42.325470 7338 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0321 03:48:42.325499 7338 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0321 03:48:42.325573 7338 factory.go:1336] Added *v1.Node event handler 7\\\\nI0321 03:48:42.325614 7338 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0321 03:48:42.326072 7338 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0321 03:48:42.326204 7338 controller.go:132] Adding controller ef_node_controller event 
handlers\\\\nI0321 03:48:42.326262 7338 ovnkube.go:599] Stopped ovnkube\\\\nI0321 03:48:42.326290 7338 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 03:48:42.326383 7338 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}
],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cpfzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:43 crc kubenswrapper[4685]: I0321 03:48:43.623319 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mlsb2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b79f01f-bf05-4f7d-b816-6ef01f21e949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://415d35423a66454f5ed06ba8facb7adbda91fae370e4d1d8a022cf7002e7b8bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tf6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mlsb2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:43 crc kubenswrapper[4685]: I0321 03:48:43.638549 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"591a6428-2384-44dd-826d-f6d2cf76c794\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b6e897d496eb8cc32a4f3c51a65335a9594f50c7010a0f022f54722edfd38e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe44d24bf93a4fdf116cd039951b919f771648e1a8c53df1507569b821dc8aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://405513730b33c9e9981a2c99d2e2c5897042c1a395d9fc099ad6818a6352cb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33920b49011815c1b030d555584aee220e8fede3b4f9ff5f8a5f554d6e1d8d02\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33920b49011815c1b030d555584aee220e8fede3b4f9ff5f8a5f554d6e1d8d02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:46:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:43 crc kubenswrapper[4685]: I0321 03:48:43.650160 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74d074e3-02ec-4391-b150-07bea56db3c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a85a4c157d29725897cd0fd271fa9844551cbb95dbc21e085684115987d8e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5af33f2ffea677fec9bcbaa1de7545651b1021c12e6ea778649d9b6b160b6b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5af33f2ffea677fec9bcbaa1de7545651b1021c12e6ea778649d9b6b160b6b8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:46:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:43 crc kubenswrapper[4685]: I0321 03:48:43.662809 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztl6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ee4ce0e664eb23ee83b20eecbc81dd4f5ad12e50e98785be3696b58528952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sc2qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztl6v\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:43 crc kubenswrapper[4685]: I0321 03:48:43.674806 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2521a678-ad6c-464b-bf7b-c4f6237c2822\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749ea865e46b19e2475b94b696bb2be8bc27add3f2ba2420c38b9d67b2b8ddd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc29c132abc0c440dd96bf14118de27e46aa41322e61ce3d685eb43962330cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podI
P\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jrr5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:43 crc kubenswrapper[4685]: I0321 03:48:43.692205 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29e0ccb5-0299-4b2a-8138-694a1bb786db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0be6102e44775f379b67814ed8f979bf81153bd68b519bc5c5e6cd2e3cb8169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://229992b297c1eb9aa6f92e56505bdf819e392fb777636759549854330bf022ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T03:46:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 03:46:10.460102 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0321 03:46:10.462103 1 observer_polling.go:159] Starting file observer\\\\nI0321 03:46:10.493091 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 03:46:10.497028 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0321 03:46:40.990647 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa74801f196e57343a98011d671414abf1f1ebf4d7b962be522f0b6cad777acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41664fe1fb74ab3c38846eaf0a2b0bf46f5b0c4a2d2b3dfadb1a5e02e5a66e81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aa64984fe04680e1c00621e521e1a2a6eb5c2c2696c88fee33cc6eaa528f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:43 crc kubenswrapper[4685]: I0321 03:48:43.710540 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:43Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:44 crc kubenswrapper[4685]: I0321 03:48:44.348026 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cpfzk_08dfc393-0ddb-4bde-9b1f-2a48549f4549/ovnkube-controller/3.log" Mar 21 03:48:44 crc kubenswrapper[4685]: I0321 03:48:44.351140 4685 scope.go:117] "RemoveContainer" containerID="f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a" Mar 21 03:48:44 crc kubenswrapper[4685]: E0321 03:48:44.351300 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-cpfzk_openshift-ovn-kubernetes(08dfc393-0ddb-4bde-9b1f-2a48549f4549)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" Mar 21 03:48:44 crc kubenswrapper[4685]: I0321 03:48:44.365929 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fd9d618-b4ed-4942-b915-76dc59fb834a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba587c4fe2f05966282b50ba5236b9f3d9ef6de63f72c70ae9f7a5222cb8b904\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60c96edd458d05f217a2e9f07a44bd221303d821a790382a82cff0b912d48f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8583e9b5dff82d5df52e281ba4069e9259b1c8fe3d1b8121d0e9f3f9e97d47b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0ba4af6f1f48fcb9ccf07dec53dc3ff1835a83dbf535bc48feb68fd646e78f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c896022c1e0726ccc5a54da9c432a0ff4082158ad76abcea60c0a1427c69134b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T03:47:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 03:47:19.136464 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 03:47:19.136735 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 03:47:19.138007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2287411523/tls.crt::/tmp/serving-cert-2287411523/tls.key\\\\\\\"\\\\nI0321 03:47:19.320330 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 03:47:19.322862 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 03:47:19.322879 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 03:47:19.322902 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 03:47:19.322907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 03:47:19.326922 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0321 03:47:19.326936 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 03:47:19.326989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.326999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 03:47:19.327009 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0321 03:47:19.327019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 03:47:19.327030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 03:47:19.327039 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 03:47:19.328267 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://182ad351b6c632ea64087d4784ea919d5c21165dc5d0373fa35db7f7f1eea435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:46:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:44Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:44 crc kubenswrapper[4685]: I0321 03:48:44.383060 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cedbf5aa59177e67115d9dbf436805563022b6cebdd1c909d460d7997af5333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:44Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:44 crc kubenswrapper[4685]: I0321 03:48:44.396363 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74d074e3-02ec-4391-b150-07bea56db3c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a85a4c157d29725897cd0fd271fa9844551cbb95dbc21e085684115987d8e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5af33f2ffea677fec9bcbaa1de7545651b1021c12e6ea778649d9b6b160b6b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5af33f2ffea677fec9bcbaa1de7545651b1021c12e6ea778649d9b6b160b6b8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:46:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:44Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:44 crc kubenswrapper[4685]: I0321 03:48:44.414743 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df1f675def99930733ba3d2e467e3c07307a00cd8a10f0c3dee668de18f92dbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:44Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:44 crc kubenswrapper[4685]: I0321 03:48:44.433771 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:44Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:44 crc kubenswrapper[4685]: I0321 03:48:44.474632 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:44Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:44 crc kubenswrapper[4685]: I0321 03:48:44.496095 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrczq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-v9rdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:44Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:44 crc kubenswrapper[4685]: I0321 03:48:44.515394 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5056ea79326435f55aafe0fc9580a47c394a0
d04559cd50b5eb720b2f599a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T03:48:42Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0321 03:48:42.325409 7338 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0321 03:48:42.325470 7338 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0321 03:48:42.325499 7338 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0321 03:48:42.325573 7338 factory.go:1336] Added *v1.Node event handler 7\\\\nI0321 03:48:42.325614 7338 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0321 03:48:42.326072 7338 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0321 03:48:42.326204 7338 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0321 03:48:42.326262 7338 ovnkube.go:599] Stopped ovnkube\\\\nI0321 03:48:42.326290 7338 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 03:48:42.326383 7338 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:48:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cpfzk_openshift-ovn-kubernetes(08dfc393-0ddb-4bde-9b1f-2a48549f4549)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdrfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cpfzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:44Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:44 crc kubenswrapper[4685]: I0321 03:48:44.524195 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mlsb2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b79f01f-bf05-4f7d-b816-6ef01f21e949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://415d35423a66454f5ed06ba8facb7adbda91fae370e4d1d8a022cf7002e7b8bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tf6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mlsb2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:44Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:44 crc kubenswrapper[4685]: I0321 03:48:44.534659 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"591a6428-2384-44dd-826d-f6d2cf76c794\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b6e897d496eb8cc32a4f3c51a65335a9594f50c7010a0f022f54722edfd38e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe44d24bf93a4fdf116cd039951b919f771648e1a8c53df1507569b821dc8aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://405513730b33c9e9981a2c99d2e2c5897042c1a395d9fc099ad6818a6352cb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33920b49011815c1b030d555584aee220e8fede3b4f9ff5f8a5f554d6e1d8d02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33920b49011815c1b030d555584aee220e8fede3b4f9ff5f8a5f554d6e1d8d02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:46:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:44Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:44 crc kubenswrapper[4685]: I0321 03:48:44.544801 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztl6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dea34cf-d6c6-42fd-b4aa-8e175c6a78f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ee4ce0e664eb23ee83b20eecbc81dd4f5ad12e50e98785be3696b58528952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sc2qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztl6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:44Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:44 crc kubenswrapper[4685]: I0321 03:48:44.555460 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:44Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:44 crc kubenswrapper[4685]: I0321 03:48:44.565461 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2521a678-ad6c-464b-bf7b-c4f6237c2822\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://749ea865e46b19e2475b94b696bb2be8bc27add3f2ba2420c38b9d67b2b8ddd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc29c132abc0c440dd96bf14118de27e46aa41322e61ce3d685eb43962330cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jrr5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:44Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:44 crc kubenswrapper[4685]: I0321 03:48:44.575973 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29e0ccb5-0299-4b2a-8138-694a1bb786db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:46:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0be6102e44775f379b67814ed8f979bf81153bd68b519bc5c5e6cd2e3cb8169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://229992b297c1eb9aa6f92e56505bdf819e392fb777636759549854330bf022ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T03:46:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n 
'' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 03:46:10.460102 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 03:46:10.462103 1 observer_polling.go:159] Starting file observer\\\\nI0321 03:46:10.493091 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 03:46:10.497028 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0321 03:46:40.990647 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa74801f196e57343a98011d671414abf1f1ebf4d7b962be522f0b6cad777acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41664fe1fb74ab3c38846eaf0a2b0bf46f5b0c4a2d2b3dfadb1a5e02e5a66e81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aa64984fe04680e1c00621e521e1a2a6eb5c2c2696c88fee33cc6eaa528f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0
a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:46:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:44Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:44 crc kubenswrapper[4685]: I0321 03:48:44.589543 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5cb18bc-bda7-463e-98fe-6d8ff293b949\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3a945a5915df9df43b15e02b30e3ffa4b0f0dd4e9283d54f80b4adb56f368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d8efc9f2ad837ea5b63d39bb04316943c7562b645f8f78b12e54d430f2a4a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2cd7978a2b2d5a571096a1f0172b76414dadc75fb54395e05d1f344f04e1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2188c1bcb129415bc4e9a76339bad4f973c3778a1c5a0735b15a8e0ccc17f5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"moun
tPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1417d49afce3f61158d9158a6c9a55e186a1032fb3c4ad275c63b6c9df81dd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6759044b25748d81e11825c925bc83b0caa70bc3bc5f930554ac6f8a19cabb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID
\\\":\\\"cri-o://6eda6a197766562ad75d8ccf23a1e8998c02f7b6f2ac62fb7cc1ff8384b3d0cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T03:47:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngpjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6xvsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:44Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:44 crc kubenswrapper[4685]: I0321 03:48:44.601876 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7jcm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9b1743-6b69-46d3-a429-6f83bf43317a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:48:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffa428e52a3c6324be6bace33035c1626678061a65e2badd389c6e93850ce25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T03:48:28Z\\\",\\\"message\\\":\\\"2026-03-21T03:47:42+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_20b06b61-cb28-4223-baf7-e69f86682674\\\\n2026-03-21T03:47:42+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_20b06b61-cb28-4223-baf7-e69f86682674 to /host/opt/cni/bin/\\\\n2026-03-21T03:47:43Z [verbose] multus-daemon started\\\\n2026-03-21T03:47:43Z [verbose] Readiness Indicator file check\\\\n2026-03-21T03:48:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:48:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x266z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7jcm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:44Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:44 crc kubenswrapper[4685]: I0321 03:48:44.613188 4685 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cea46fe2-4e41-43ab-a069-cb30fb4e732c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8af9aaf0940271c159c00b04ffb7a1dd9471caa93ad4c7bd4507461a095923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5ntg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T03:47:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7r9cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:44Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:44 crc kubenswrapper[4685]: I0321 03:48:44.623876 4685 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T03:47:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aab80bc7857e9a3d86f540c725f4e35d0d86f979abed1ea5c9cddb4a6263924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d981cec0491154ace10ce4f437d1101c8dd69c34129a42b7dd5af226f8b14b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T03:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T03:48:44Z is after 2025-08-24T17:21:41Z" Mar 21 03:48:45 crc kubenswrapper[4685]: I0321 03:48:45.300616 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:48:45 crc kubenswrapper[4685]: I0321 03:48:45.300661 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:48:45 crc kubenswrapper[4685]: I0321 03:48:45.300626 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:48:45 crc kubenswrapper[4685]: E0321 03:48:45.300744 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:48:45 crc kubenswrapper[4685]: I0321 03:48:45.300822 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:48:45 crc kubenswrapper[4685]: E0321 03:48:45.301033 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:48:45 crc kubenswrapper[4685]: E0321 03:48:45.301018 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:48:45 crc kubenswrapper[4685]: E0321 03:48:45.301103 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:48:47 crc kubenswrapper[4685]: I0321 03:48:47.300881 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:48:47 crc kubenswrapper[4685]: I0321 03:48:47.300950 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:48:47 crc kubenswrapper[4685]: I0321 03:48:47.300953 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:48:47 crc kubenswrapper[4685]: I0321 03:48:47.300893 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:48:47 crc kubenswrapper[4685]: E0321 03:48:47.301096 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:48:47 crc kubenswrapper[4685]: E0321 03:48:47.301199 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:48:47 crc kubenswrapper[4685]: E0321 03:48:47.301428 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:48:47 crc kubenswrapper[4685]: E0321 03:48:47.301606 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:48:48 crc kubenswrapper[4685]: I0321 03:48:48.360612 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podStartSLOduration=118.360578391 podStartE2EDuration="1m58.360578391s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:48:48.338215185 +0000 UTC m=+160.815284017" watchObservedRunningTime="2026-03-21 03:48:48.360578391 +0000 UTC m=+160.837647223" Mar 21 03:48:48 crc kubenswrapper[4685]: E0321 03:48:48.418485 4685 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 21 03:48:48 crc kubenswrapper[4685]: I0321 03:48:48.426047 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-7jcm2" podStartSLOduration=118.426018409 podStartE2EDuration="1m58.426018409s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:48:48.425775951 +0000 UTC m=+160.902844803" watchObservedRunningTime="2026-03-21 03:48:48.426018409 +0000 UTC m=+160.903087241" Mar 21 03:48:48 crc kubenswrapper[4685]: I0321 03:48:48.426358 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6xvsf" podStartSLOduration=118.426346489 podStartE2EDuration="1m58.426346489s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:48:48.399669466 +0000 UTC m=+160.876738338" watchObservedRunningTime="2026-03-21 03:48:48.426346489 +0000 UTC m=+160.903415311" Mar 21 03:48:48 crc kubenswrapper[4685]: I0321 03:48:48.454877 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=57.454809688 podStartE2EDuration="57.454809688s" podCreationTimestamp="2026-03-21 03:47:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:48:48.454195859 +0000 UTC m=+160.931264661" watchObservedRunningTime="2026-03-21 03:48:48.454809688 +0000 UTC m=+160.931878520" Mar 21 03:48:48 crc kubenswrapper[4685]: I0321 03:48:48.599053 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-mlsb2" podStartSLOduration=118.599015183 podStartE2EDuration="1m58.599015183s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:48:48.579320591 +0000 UTC m=+161.056389393" watchObservedRunningTime="2026-03-21 03:48:48.599015183 +0000 UTC m=+161.076084005" Mar 21 03:48:48 crc kubenswrapper[4685]: I0321 03:48:48.600057 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=30.600041526 podStartE2EDuration="30.600041526s" podCreationTimestamp="2026-03-21 03:48:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:48:48.5986044 +0000 UTC m=+161.075673202" watchObservedRunningTime="2026-03-21 03:48:48.600041526 +0000 UTC m=+161.077110368" Mar 21 03:48:48 crc kubenswrapper[4685]: I0321 03:48:48.611890 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=24.611870589 podStartE2EDuration="24.611870589s" podCreationTimestamp="2026-03-21 03:48:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:48:48.611663583 +0000 UTC m=+161.088732415" watchObservedRunningTime="2026-03-21 03:48:48.611870589 +0000 UTC m=+161.088939391" Mar 21 03:48:48 crc kubenswrapper[4685]: I0321 03:48:48.643495 4685 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-ztl6v" podStartSLOduration=118.643474768 podStartE2EDuration="1m58.643474768s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:48:48.643272661 +0000 UTC m=+161.120341473" watchObservedRunningTime="2026-03-21 03:48:48.643474768 +0000 UTC m=+161.120543570" Mar 21 03:48:48 crc kubenswrapper[4685]: I0321 03:48:48.661694 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=42.661674322 podStartE2EDuration="42.661674322s" podCreationTimestamp="2026-03-21 03:48:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:48:48.660705812 +0000 UTC m=+161.137774614" watchObservedRunningTime="2026-03-21 03:48:48.661674322 +0000 UTC m=+161.138743114" Mar 21 03:48:48 crc kubenswrapper[4685]: I0321 03:48:48.690134 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jrr5c" podStartSLOduration=118.690118121 podStartE2EDuration="1m58.690118121s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:48:48.689307495 +0000 UTC m=+161.166376337" watchObservedRunningTime="2026-03-21 03:48:48.690118121 +0000 UTC m=+161.167186923" Mar 21 03:48:49 crc kubenswrapper[4685]: I0321 03:48:49.300400 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:48:49 crc kubenswrapper[4685]: I0321 03:48:49.300444 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:48:49 crc kubenswrapper[4685]: I0321 03:48:49.300499 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:48:49 crc kubenswrapper[4685]: E0321 03:48:49.301158 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:48:49 crc kubenswrapper[4685]: I0321 03:48:49.300498 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:48:49 crc kubenswrapper[4685]: E0321 03:48:49.301001 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:48:49 crc kubenswrapper[4685]: E0321 03:48:49.301455 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:48:49 crc kubenswrapper[4685]: E0321 03:48:49.301826 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:48:51 crc kubenswrapper[4685]: I0321 03:48:51.300027 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:48:51 crc kubenswrapper[4685]: I0321 03:48:51.300055 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:48:51 crc kubenswrapper[4685]: I0321 03:48:51.300079 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:48:51 crc kubenswrapper[4685]: I0321 03:48:51.300217 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:48:51 crc kubenswrapper[4685]: E0321 03:48:51.300396 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:48:51 crc kubenswrapper[4685]: E0321 03:48:51.300575 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:48:51 crc kubenswrapper[4685]: E0321 03:48:51.300725 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:48:51 crc kubenswrapper[4685]: E0321 03:48:51.300901 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:48:51 crc kubenswrapper[4685]: I0321 03:48:51.614215 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 03:48:51 crc kubenswrapper[4685]: I0321 03:48:51.614273 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 03:48:51 crc kubenswrapper[4685]: I0321 03:48:51.614291 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 03:48:51 crc kubenswrapper[4685]: I0321 03:48:51.614315 4685 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 03:48:51 crc kubenswrapper[4685]: I0321 03:48:51.614333 4685 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T03:48:51Z","lastTransitionTime":"2026-03-21T03:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 03:48:51 crc kubenswrapper[4685]: I0321 03:48:51.681129 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2psp"] Mar 21 03:48:51 crc kubenswrapper[4685]: I0321 03:48:51.681518 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2psp" Mar 21 03:48:51 crc kubenswrapper[4685]: I0321 03:48:51.685762 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 21 03:48:51 crc kubenswrapper[4685]: I0321 03:48:51.686491 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 21 03:48:51 crc kubenswrapper[4685]: I0321 03:48:51.686502 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 21 03:48:51 crc kubenswrapper[4685]: I0321 03:48:51.687407 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 21 03:48:51 crc kubenswrapper[4685]: I0321 03:48:51.702646 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0f4c19ff-6f4a-438f-813c-3860ba0ce2b2-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-c2psp\" (UID: \"0f4c19ff-6f4a-438f-813c-3860ba0ce2b2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2psp" Mar 21 03:48:51 crc kubenswrapper[4685]: I0321 03:48:51.702788 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0f4c19ff-6f4a-438f-813c-3860ba0ce2b2-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-c2psp\" (UID: \"0f4c19ff-6f4a-438f-813c-3860ba0ce2b2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2psp" Mar 21 03:48:51 crc kubenswrapper[4685]: I0321 03:48:51.702951 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f4c19ff-6f4a-438f-813c-3860ba0ce2b2-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-c2psp\" (UID: \"0f4c19ff-6f4a-438f-813c-3860ba0ce2b2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2psp" Mar 21 03:48:51 crc kubenswrapper[4685]: I0321 03:48:51.703048 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0f4c19ff-6f4a-438f-813c-3860ba0ce2b2-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-c2psp\" (UID: \"0f4c19ff-6f4a-438f-813c-3860ba0ce2b2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2psp" Mar 21 03:48:51 crc kubenswrapper[4685]: I0321 03:48:51.703113 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0f4c19ff-6f4a-438f-813c-3860ba0ce2b2-service-ca\") pod \"cluster-version-operator-5c965bbfc6-c2psp\" (UID: \"0f4c19ff-6f4a-438f-813c-3860ba0ce2b2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2psp" Mar 21 03:48:51 crc kubenswrapper[4685]: I0321 03:48:51.804067 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0f4c19ff-6f4a-438f-813c-3860ba0ce2b2-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-c2psp\" (UID: \"0f4c19ff-6f4a-438f-813c-3860ba0ce2b2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2psp" Mar 21 
03:48:51 crc kubenswrapper[4685]: I0321 03:48:51.804149 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0f4c19ff-6f4a-438f-813c-3860ba0ce2b2-service-ca\") pod \"cluster-version-operator-5c965bbfc6-c2psp\" (UID: \"0f4c19ff-6f4a-438f-813c-3860ba0ce2b2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2psp" Mar 21 03:48:51 crc kubenswrapper[4685]: I0321 03:48:51.804218 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0f4c19ff-6f4a-438f-813c-3860ba0ce2b2-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-c2psp\" (UID: \"0f4c19ff-6f4a-438f-813c-3860ba0ce2b2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2psp" Mar 21 03:48:51 crc kubenswrapper[4685]: I0321 03:48:51.804260 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0f4c19ff-6f4a-438f-813c-3860ba0ce2b2-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-c2psp\" (UID: \"0f4c19ff-6f4a-438f-813c-3860ba0ce2b2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2psp" Mar 21 03:48:51 crc kubenswrapper[4685]: I0321 03:48:51.804344 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0f4c19ff-6f4a-438f-813c-3860ba0ce2b2-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-c2psp\" (UID: \"0f4c19ff-6f4a-438f-813c-3860ba0ce2b2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2psp" Mar 21 03:48:51 crc kubenswrapper[4685]: I0321 03:48:51.804426 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f4c19ff-6f4a-438f-813c-3860ba0ce2b2-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-c2psp\" (UID: \"0f4c19ff-6f4a-438f-813c-3860ba0ce2b2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2psp" Mar 21 03:48:51 crc kubenswrapper[4685]: I0321 03:48:51.804501 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0f4c19ff-6f4a-438f-813c-3860ba0ce2b2-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-c2psp\" (UID: \"0f4c19ff-6f4a-438f-813c-3860ba0ce2b2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2psp" Mar 21 03:48:51 crc kubenswrapper[4685]: I0321 03:48:51.805635 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0f4c19ff-6f4a-438f-813c-3860ba0ce2b2-service-ca\") pod \"cluster-version-operator-5c965bbfc6-c2psp\" (UID: \"0f4c19ff-6f4a-438f-813c-3860ba0ce2b2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2psp" Mar 21 03:48:51 crc kubenswrapper[4685]: I0321 03:48:51.811232 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f4c19ff-6f4a-438f-813c-3860ba0ce2b2-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-c2psp\" (UID: \"0f4c19ff-6f4a-438f-813c-3860ba0ce2b2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2psp" Mar 21 03:48:51 crc kubenswrapper[4685]: I0321 03:48:51.822328 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/0f4c19ff-6f4a-438f-813c-3860ba0ce2b2-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-c2psp\" (UID: \"0f4c19ff-6f4a-438f-813c-3860ba0ce2b2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2psp" Mar 21 03:48:51 crc kubenswrapper[4685]: I0321 03:48:51.996548 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2psp" Mar 21 03:48:52 crc kubenswrapper[4685]: I0321 03:48:52.324301 4685 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 21 03:48:52 crc kubenswrapper[4685]: I0321 03:48:52.334428 4685 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 21 03:48:52 crc kubenswrapper[4685]: I0321 03:48:52.387674 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2psp" event={"ID":"0f4c19ff-6f4a-438f-813c-3860ba0ce2b2","Type":"ContainerStarted","Data":"656d65b501db9d0f347edb33e68ca80b039c94140fbff1204c7bda00daa22768"} Mar 21 03:48:52 crc kubenswrapper[4685]: I0321 03:48:52.387724 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2psp" event={"ID":"0f4c19ff-6f4a-438f-813c-3860ba0ce2b2","Type":"ContainerStarted","Data":"3c4959badd5254ad0bbf8e8eb8ee79a58ea532a52b98de28640d0fb9d2d952b0"} Mar 21 03:48:52 crc kubenswrapper[4685]: I0321 03:48:52.416505 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c2psp" podStartSLOduration=122.416475197 podStartE2EDuration="2m2.416475197s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:48:52.405330515 +0000 UTC m=+164.882399307" watchObservedRunningTime="2026-03-21 03:48:52.416475197 +0000 UTC m=+164.893544019" Mar 21 03:48:53 crc kubenswrapper[4685]: I0321 03:48:53.300590 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:48:53 crc kubenswrapper[4685]: E0321 03:48:53.301163 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:48:53 crc kubenswrapper[4685]: I0321 03:48:53.301490 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:48:53 crc kubenswrapper[4685]: E0321 03:48:53.301618 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:48:53 crc kubenswrapper[4685]: I0321 03:48:53.301959 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:48:53 crc kubenswrapper[4685]: E0321 03:48:53.302113 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:48:53 crc kubenswrapper[4685]: I0321 03:48:53.302338 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:48:53 crc kubenswrapper[4685]: E0321 03:48:53.302442 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:48:53 crc kubenswrapper[4685]: E0321 03:48:53.419359 4685 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 03:48:54 crc kubenswrapper[4685]: I0321 03:48:54.316047 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 21 03:48:55 crc kubenswrapper[4685]: I0321 03:48:55.300577 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:48:55 crc kubenswrapper[4685]: I0321 03:48:55.300639 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:48:55 crc kubenswrapper[4685]: E0321 03:48:55.300756 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:48:55 crc kubenswrapper[4685]: I0321 03:48:55.300599 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:48:55 crc kubenswrapper[4685]: I0321 03:48:55.300870 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:48:55 crc kubenswrapper[4685]: E0321 03:48:55.301066 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:48:55 crc kubenswrapper[4685]: E0321 03:48:55.301193 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:48:55 crc kubenswrapper[4685]: E0321 03:48:55.301308 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:48:56 crc kubenswrapper[4685]: I0321 03:48:56.301474 4685 scope.go:117] "RemoveContainer" containerID="f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a" Mar 21 03:48:56 crc kubenswrapper[4685]: E0321 03:48:56.301784 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-cpfzk_openshift-ovn-kubernetes(08dfc393-0ddb-4bde-9b1f-2a48549f4549)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" Mar 21 03:48:57 crc kubenswrapper[4685]: I0321 03:48:57.300639 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:48:57 crc kubenswrapper[4685]: I0321 03:48:57.300735 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:48:57 crc kubenswrapper[4685]: E0321 03:48:57.300748 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:48:57 crc kubenswrapper[4685]: I0321 03:48:57.300833 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:48:57 crc kubenswrapper[4685]: I0321 03:48:57.300936 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:48:57 crc kubenswrapper[4685]: E0321 03:48:57.301094 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:48:57 crc kubenswrapper[4685]: E0321 03:48:57.301400 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:48:57 crc kubenswrapper[4685]: E0321 03:48:57.301514 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:48:58 crc kubenswrapper[4685]: I0321 03:48:58.351399 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=4.351379817 podStartE2EDuration="4.351379817s" podCreationTimestamp="2026-03-21 03:48:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:48:58.346266485 +0000 UTC m=+170.823335287" watchObservedRunningTime="2026-03-21 03:48:58.351379817 +0000 UTC m=+170.828448609" Mar 21 03:48:58 crc kubenswrapper[4685]: E0321 03:48:58.419729 4685 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 03:48:59 crc kubenswrapper[4685]: I0321 03:48:59.300402 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:48:59 crc kubenswrapper[4685]: I0321 03:48:59.300491 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:48:59 crc kubenswrapper[4685]: I0321 03:48:59.300495 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:48:59 crc kubenswrapper[4685]: I0321 03:48:59.300425 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:48:59 crc kubenswrapper[4685]: E0321 03:48:59.300577 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:48:59 crc kubenswrapper[4685]: E0321 03:48:59.300660 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:48:59 crc kubenswrapper[4685]: E0321 03:48:59.300935 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:48:59 crc kubenswrapper[4685]: E0321 03:48:59.301037 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:49:01 crc kubenswrapper[4685]: I0321 03:49:01.300418 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:49:01 crc kubenswrapper[4685]: E0321 03:49:01.300565 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:49:01 crc kubenswrapper[4685]: I0321 03:49:01.300435 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:49:01 crc kubenswrapper[4685]: I0321 03:49:01.300411 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:49:01 crc kubenswrapper[4685]: I0321 03:49:01.300618 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:49:01 crc kubenswrapper[4685]: E0321 03:49:01.300645 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:49:01 crc kubenswrapper[4685]: E0321 03:49:01.300763 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:49:01 crc kubenswrapper[4685]: E0321 03:49:01.300910 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:49:03 crc kubenswrapper[4685]: I0321 03:49:03.300043 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:49:03 crc kubenswrapper[4685]: I0321 03:49:03.300139 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:49:03 crc kubenswrapper[4685]: I0321 03:49:03.300058 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:49:03 crc kubenswrapper[4685]: E0321 03:49:03.300353 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:49:03 crc kubenswrapper[4685]: E0321 03:49:03.300479 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:49:03 crc kubenswrapper[4685]: E0321 03:49:03.300506 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:49:03 crc kubenswrapper[4685]: I0321 03:49:03.301056 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:49:03 crc kubenswrapper[4685]: E0321 03:49:03.301274 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:49:03 crc kubenswrapper[4685]: E0321 03:49:03.421608 4685 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 03:49:05 crc kubenswrapper[4685]: I0321 03:49:05.300281 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:49:05 crc kubenswrapper[4685]: I0321 03:49:05.300408 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:49:05 crc kubenswrapper[4685]: E0321 03:49:05.300482 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:49:05 crc kubenswrapper[4685]: I0321 03:49:05.300519 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:49:05 crc kubenswrapper[4685]: I0321 03:49:05.300585 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:49:05 crc kubenswrapper[4685]: E0321 03:49:05.300739 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:49:05 crc kubenswrapper[4685]: E0321 03:49:05.300806 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:49:05 crc kubenswrapper[4685]: E0321 03:49:05.300988 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:49:07 crc kubenswrapper[4685]: I0321 03:49:07.300481 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:49:07 crc kubenswrapper[4685]: E0321 03:49:07.300653 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:49:07 crc kubenswrapper[4685]: I0321 03:49:07.300755 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:49:07 crc kubenswrapper[4685]: I0321 03:49:07.300908 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:49:07 crc kubenswrapper[4685]: E0321 03:49:07.301096 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:49:07 crc kubenswrapper[4685]: I0321 03:49:07.301304 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:49:07 crc kubenswrapper[4685]: E0321 03:49:07.301432 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:49:07 crc kubenswrapper[4685]: E0321 03:49:07.301291 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:49:08 crc kubenswrapper[4685]: E0321 03:49:08.422354 4685 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 03:49:09 crc kubenswrapper[4685]: I0321 03:49:09.300441 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:49:09 crc kubenswrapper[4685]: I0321 03:49:09.300496 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:49:09 crc kubenswrapper[4685]: I0321 03:49:09.300531 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:49:09 crc kubenswrapper[4685]: E0321 03:49:09.300624 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:49:09 crc kubenswrapper[4685]: I0321 03:49:09.300701 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:49:09 crc kubenswrapper[4685]: E0321 03:49:09.300911 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:49:09 crc kubenswrapper[4685]: E0321 03:49:09.300969 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:49:09 crc kubenswrapper[4685]: E0321 03:49:09.301074 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:49:11 crc kubenswrapper[4685]: I0321 03:49:11.300265 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:49:11 crc kubenswrapper[4685]: I0321 03:49:11.300474 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:49:11 crc kubenswrapper[4685]: I0321 03:49:11.300697 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:49:11 crc kubenswrapper[4685]: E0321 03:49:11.300671 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:49:11 crc kubenswrapper[4685]: I0321 03:49:11.300762 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:49:11 crc kubenswrapper[4685]: E0321 03:49:11.301019 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:49:11 crc kubenswrapper[4685]: E0321 03:49:11.301148 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:49:11 crc kubenswrapper[4685]: E0321 03:49:11.301518 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:49:11 crc kubenswrapper[4685]: I0321 03:49:11.302645 4685 scope.go:117] "RemoveContainer" containerID="f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a" Mar 21 03:49:11 crc kubenswrapper[4685]: E0321 03:49:11.303075 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-cpfzk_openshift-ovn-kubernetes(08dfc393-0ddb-4bde-9b1f-2a48549f4549)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" Mar 21 03:49:13 crc kubenswrapper[4685]: I0321 03:49:13.300484 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:49:13 crc kubenswrapper[4685]: I0321 03:49:13.300558 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:49:13 crc kubenswrapper[4685]: E0321 03:49:13.300956 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:49:13 crc kubenswrapper[4685]: I0321 03:49:13.300585 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:49:13 crc kubenswrapper[4685]: I0321 03:49:13.300558 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:49:13 crc kubenswrapper[4685]: E0321 03:49:13.301111 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:49:13 crc kubenswrapper[4685]: E0321 03:49:13.301319 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:49:13 crc kubenswrapper[4685]: E0321 03:49:13.301394 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:49:13 crc kubenswrapper[4685]: E0321 03:49:13.424264 4685 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 03:49:15 crc kubenswrapper[4685]: I0321 03:49:15.301261 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:49:15 crc kubenswrapper[4685]: I0321 03:49:15.301405 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:49:15 crc kubenswrapper[4685]: I0321 03:49:15.301493 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:49:15 crc kubenswrapper[4685]: E0321 03:49:15.301528 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:49:15 crc kubenswrapper[4685]: E0321 03:49:15.301739 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:49:15 crc kubenswrapper[4685]: E0321 03:49:15.301947 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:49:15 crc kubenswrapper[4685]: I0321 03:49:15.301298 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:49:15 crc kubenswrapper[4685]: E0321 03:49:15.303563 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:49:15 crc kubenswrapper[4685]: I0321 03:49:15.469592 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7jcm2_cd9b1743-6b69-46d3-a429-6f83bf43317a/kube-multus/1.log" Mar 21 03:49:15 crc kubenswrapper[4685]: I0321 03:49:15.470422 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7jcm2_cd9b1743-6b69-46d3-a429-6f83bf43317a/kube-multus/0.log" Mar 21 03:49:15 crc kubenswrapper[4685]: I0321 03:49:15.470467 4685 generic.go:334] "Generic (PLEG): container finished" podID="cd9b1743-6b69-46d3-a429-6f83bf43317a" containerID="ffa428e52a3c6324be6bace33035c1626678061a65e2badd389c6e93850ce25f" exitCode=1 Mar 21 03:49:15 crc kubenswrapper[4685]: I0321 03:49:15.470497 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7jcm2" event={"ID":"cd9b1743-6b69-46d3-a429-6f83bf43317a","Type":"ContainerDied","Data":"ffa428e52a3c6324be6bace33035c1626678061a65e2badd389c6e93850ce25f"} Mar 21 03:49:15 crc kubenswrapper[4685]: I0321 03:49:15.470529 4685 scope.go:117] "RemoveContainer" containerID="a947393ce78ff4c9dddd93eb5ee7cb034673ebb6f0391cb868b473028074aa46" Mar 21 03:49:15 crc kubenswrapper[4685]: I0321 03:49:15.471032 4685 scope.go:117] "RemoveContainer" containerID="ffa428e52a3c6324be6bace33035c1626678061a65e2badd389c6e93850ce25f" Mar 21 03:49:15 crc kubenswrapper[4685]: E0321 03:49:15.471252 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-7jcm2_openshift-multus(cd9b1743-6b69-46d3-a429-6f83bf43317a)\"" pod="openshift-multus/multus-7jcm2" podUID="cd9b1743-6b69-46d3-a429-6f83bf43317a" Mar 21 03:49:16 crc kubenswrapper[4685]: I0321 03:49:16.476462 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7jcm2_cd9b1743-6b69-46d3-a429-6f83bf43317a/kube-multus/1.log" Mar 21 03:49:17 crc kubenswrapper[4685]: I0321 03:49:17.300264 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:49:17 crc kubenswrapper[4685]: I0321 03:49:17.300420 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:49:17 crc kubenswrapper[4685]: I0321 03:49:17.300305 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:49:17 crc kubenswrapper[4685]: E0321 03:49:17.300529 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:49:17 crc kubenswrapper[4685]: E0321 03:49:17.300882 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:49:17 crc kubenswrapper[4685]: I0321 03:49:17.300985 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:49:17 crc kubenswrapper[4685]: E0321 03:49:17.301099 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:49:17 crc kubenswrapper[4685]: E0321 03:49:17.301221 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:49:18 crc kubenswrapper[4685]: E0321 03:49:18.425179 4685 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 03:49:19 crc kubenswrapper[4685]: I0321 03:49:19.300981 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:49:19 crc kubenswrapper[4685]: E0321 03:49:19.301157 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:49:19 crc kubenswrapper[4685]: I0321 03:49:19.301201 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:49:19 crc kubenswrapper[4685]: I0321 03:49:19.301294 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:49:19 crc kubenswrapper[4685]: E0321 03:49:19.301350 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:49:19 crc kubenswrapper[4685]: I0321 03:49:19.301222 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:49:19 crc kubenswrapper[4685]: E0321 03:49:19.301489 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:49:19 crc kubenswrapper[4685]: E0321 03:49:19.301558 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:49:21 crc kubenswrapper[4685]: I0321 03:49:21.300642 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:49:21 crc kubenswrapper[4685]: I0321 03:49:21.300713 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:49:21 crc kubenswrapper[4685]: I0321 03:49:21.300658 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:49:21 crc kubenswrapper[4685]: E0321 03:49:21.300789 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:49:21 crc kubenswrapper[4685]: E0321 03:49:21.300912 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:49:21 crc kubenswrapper[4685]: I0321 03:49:21.301101 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:49:21 crc kubenswrapper[4685]: E0321 03:49:21.301215 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:49:21 crc kubenswrapper[4685]: E0321 03:49:21.301448 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:49:23 crc kubenswrapper[4685]: I0321 03:49:23.299970 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:49:23 crc kubenswrapper[4685]: I0321 03:49:23.300042 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:49:23 crc kubenswrapper[4685]: I0321 03:49:23.300053 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:49:23 crc kubenswrapper[4685]: E0321 03:49:23.300271 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:49:23 crc kubenswrapper[4685]: E0321 03:49:23.300345 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:49:23 crc kubenswrapper[4685]: I0321 03:49:23.300057 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:49:23 crc kubenswrapper[4685]: E0321 03:49:23.300168 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:49:23 crc kubenswrapper[4685]: E0321 03:49:23.300425 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:49:23 crc kubenswrapper[4685]: E0321 03:49:23.426093 4685 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 03:49:25 crc kubenswrapper[4685]: I0321 03:49:25.300576 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:49:25 crc kubenswrapper[4685]: I0321 03:49:25.300667 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:49:25 crc kubenswrapper[4685]: I0321 03:49:25.300598 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:49:25 crc kubenswrapper[4685]: E0321 03:49:25.300707 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:49:25 crc kubenswrapper[4685]: I0321 03:49:25.300675 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:49:25 crc kubenswrapper[4685]: E0321 03:49:25.300821 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:49:25 crc kubenswrapper[4685]: E0321 03:49:25.301040 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:49:25 crc kubenswrapper[4685]: E0321 03:49:25.301105 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:49:26 crc kubenswrapper[4685]: I0321 03:49:26.301135 4685 scope.go:117] "RemoveContainer" containerID="f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a" Mar 21 03:49:27 crc kubenswrapper[4685]: I0321 03:49:27.288787 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-v9rdl"] Mar 21 03:49:27 crc kubenswrapper[4685]: I0321 03:49:27.289235 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:49:27 crc kubenswrapper[4685]: E0321 03:49:27.289334 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:49:27 crc kubenswrapper[4685]: I0321 03:49:27.300557 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:49:27 crc kubenswrapper[4685]: I0321 03:49:27.300639 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:49:27 crc kubenswrapper[4685]: E0321 03:49:27.300675 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:49:27 crc kubenswrapper[4685]: I0321 03:49:27.300565 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:49:27 crc kubenswrapper[4685]: E0321 03:49:27.300757 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:49:27 crc kubenswrapper[4685]: E0321 03:49:27.301033 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:49:27 crc kubenswrapper[4685]: I0321 03:49:27.515183 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cpfzk_08dfc393-0ddb-4bde-9b1f-2a48549f4549/ovnkube-controller/3.log" Mar 21 03:49:27 crc kubenswrapper[4685]: I0321 03:49:27.517546 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" event={"ID":"08dfc393-0ddb-4bde-9b1f-2a48549f4549","Type":"ContainerStarted","Data":"e3b8a82bebc823be171b9425db7ef881f4918b904d172973b06ee5b15fe79a15"} Mar 21 03:49:27 crc kubenswrapper[4685]: I0321 03:49:27.517886 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:49:27 crc kubenswrapper[4685]: I0321 03:49:27.549659 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" podStartSLOduration=157.549641946 podStartE2EDuration="2m37.549641946s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:27.548160941 +0000 UTC m=+200.025229733" watchObservedRunningTime="2026-03-21 03:49:27.549641946 +0000 UTC m=+200.026710738" Mar 21 03:49:28 crc kubenswrapper[4685]: E0321 03:49:28.426940 4685 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 03:49:29 crc kubenswrapper[4685]: I0321 03:49:29.300663 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:49:29 crc kubenswrapper[4685]: I0321 03:49:29.300786 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:49:29 crc kubenswrapper[4685]: I0321 03:49:29.300828 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:49:29 crc kubenswrapper[4685]: I0321 03:49:29.300856 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:49:29 crc kubenswrapper[4685]: E0321 03:49:29.300949 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:49:29 crc kubenswrapper[4685]: I0321 03:49:29.301052 4685 scope.go:117] "RemoveContainer" containerID="ffa428e52a3c6324be6bace33035c1626678061a65e2badd389c6e93850ce25f" Mar 21 03:49:29 crc kubenswrapper[4685]: E0321 03:49:29.301130 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:49:29 crc kubenswrapper[4685]: E0321 03:49:29.301259 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:49:29 crc kubenswrapper[4685]: E0321 03:49:29.301343 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:49:29 crc kubenswrapper[4685]: I0321 03:49:29.525106 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7jcm2_cd9b1743-6b69-46d3-a429-6f83bf43317a/kube-multus/1.log" Mar 21 03:49:29 crc kubenswrapper[4685]: I0321 03:49:29.525456 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7jcm2" event={"ID":"cd9b1743-6b69-46d3-a429-6f83bf43317a","Type":"ContainerStarted","Data":"aeb2e6d1910f6dc402503b823272d26ca9f0ffb3b41c3137e50a4345d1710170"} Mar 21 03:49:31 crc kubenswrapper[4685]: I0321 03:49:31.300057 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:49:31 crc kubenswrapper[4685]: I0321 03:49:31.300085 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:49:31 crc kubenswrapper[4685]: I0321 03:49:31.300124 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:49:31 crc kubenswrapper[4685]: E0321 03:49:31.300916 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:49:31 crc kubenswrapper[4685]: E0321 03:49:31.300730 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:49:31 crc kubenswrapper[4685]: I0321 03:49:31.300171 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:49:31 crc kubenswrapper[4685]: E0321 03:49:31.301013 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:49:31 crc kubenswrapper[4685]: E0321 03:49:31.301158 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:49:33 crc kubenswrapper[4685]: I0321 03:49:33.300737 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:49:33 crc kubenswrapper[4685]: E0321 03:49:33.300926 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9rdl" podUID="fda9b1ff-e4a8-4d15-8f7b-2974991cd252" Mar 21 03:49:33 crc kubenswrapper[4685]: I0321 03:49:33.301059 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:49:33 crc kubenswrapper[4685]: E0321 03:49:33.301198 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 03:49:33 crc kubenswrapper[4685]: I0321 03:49:33.301192 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:49:33 crc kubenswrapper[4685]: I0321 03:49:33.301254 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:49:33 crc kubenswrapper[4685]: E0321 03:49:33.301269 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 03:49:33 crc kubenswrapper[4685]: E0321 03:49:33.301478 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 03:49:35 crc kubenswrapper[4685]: I0321 03:49:35.300240 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:49:35 crc kubenswrapper[4685]: I0321 03:49:35.300327 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:49:35 crc kubenswrapper[4685]: I0321 03:49:35.300367 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:49:35 crc kubenswrapper[4685]: I0321 03:49:35.300725 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:49:35 crc kubenswrapper[4685]: I0321 03:49:35.306455 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 21 03:49:35 crc kubenswrapper[4685]: I0321 03:49:35.307066 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 21 03:49:35 crc kubenswrapper[4685]: I0321 03:49:35.307428 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 21 03:49:35 crc kubenswrapper[4685]: I0321 03:49:35.307885 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 21 03:49:35 crc kubenswrapper[4685]: I0321 03:49:35.308194 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 21 03:49:35 crc kubenswrapper[4685]: I0321 03:49:35.308311 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 21 03:49:39 crc kubenswrapper[4685]: I0321 03:49:39.684986 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:49:39 crc kubenswrapper[4685]: I0321 03:49:39.685586 4685 patch_prober.go:28] interesting pod/machine-config-daemon-7r9cg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 03:49:39 crc kubenswrapper[4685]: I0321 03:49:39.685683 4685 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.418017 4685 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.475482 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-864h4"] Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.476301 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-864h4" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.481900 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-plggv"] Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.482619 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plggv" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.484363 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.484538 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.484912 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.488403 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.488482 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.488761 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.492047 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-flxks"] Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.493095 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-flxks" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.493288 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.493673 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.493974 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.494307 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.495991 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.496230 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.496394 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.498334 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.503530 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.503867 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.504012 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.504117 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.504340 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.509062 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.509787 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.510262 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.513243 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.513350 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mkjd2"] Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.513820 4685 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-84ftw"] Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.514310 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-84ftw" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.514395 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mkjd2" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.515026 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m7q8h"] Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.515742 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.517213 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.517253 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.520100 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n6cmp"] Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.520699 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n6cmp" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.521346 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.521568 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.521598 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.521607 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.521869 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.521999 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.522129 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.522298 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.522649 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.522817 4685 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager"/"serving-cert" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.522970 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.523104 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.523148 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.523263 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.523332 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.523359 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.523454 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.523515 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.542900 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2ptjk"] Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.545930 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.547413 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.552751 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.553017 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.553564 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.554130 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.554326 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-cd44l"] Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.554828 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-cd44l" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.555308 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2ptjk" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.558148 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.559216 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.559424 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2tvkz"] Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.559925 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.560221 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2tvkz" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.561629 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.566198 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.566567 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.566883 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.567085 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.567161 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.569551 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.569605 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.569789 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.569903 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.569989 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.569995 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.570057 4685 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.569792 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.571214 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.571454 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.574652 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.579896 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.580175 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2ckzq"] Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.580556 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl"] Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.580974 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.581033 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2ckzq" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.581664 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-j8qrk"] Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.582155 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.586645 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-djlmv"] Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.586759 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.586912 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-97584"] Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.587080 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-j8qrk" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.587141 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.587177 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.587203 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.587210 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-97584" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.587350 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.587471 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.587570 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-djlmv" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.587575 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.587663 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.588090 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.588221 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.588255 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.588268 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.588410 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.588462 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.592573 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.593146 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.593186 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 
03:49:43.592634 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.592661 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.593368 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.593377 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-p5r46"] Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.593792 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6tr2m"] Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.593945 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.593969 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b5xg\" (UniqueName: \"kubernetes.io/projected/cd1c8c06-710c-401b-803e-9cc18aa1b4b6-kube-api-access-6b5xg\") pod \"controller-manager-879f6c89f-mkjd2\" (UID: \"cd1c8c06-710c-401b-803e-9cc18aa1b4b6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mkjd2" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.593993 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbxpx\" (UniqueName: \"kubernetes.io/projected/588eb87c-d2c0-45fb-a0f7-33de36d5d745-kube-api-access-hbxpx\") pod \"route-controller-manager-6576b87f9c-plggv\" (UID: \"588eb87c-d2c0-45fb-a0f7-33de36d5d745\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plggv" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594011 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594028 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf5rg\" (UniqueName: \"kubernetes.io/projected/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-kube-api-access-hf5rg\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594044 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd1c8c06-710c-401b-803e-9cc18aa1b4b6-serving-cert\") pod \"controller-manager-879f6c89f-mkjd2\" (UID: 
\"cd1c8c06-710c-401b-803e-9cc18aa1b4b6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mkjd2" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594058 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594072 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/9c1c82f3-080b-47ea-93df-596d79aa2bf8-image-import-ca\") pod \"apiserver-76f77b778f-864h4\" (UID: \"9c1c82f3-080b-47ea-93df-596d79aa2bf8\") " pod="openshift-apiserver/apiserver-76f77b778f-864h4" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594088 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9c1c82f3-080b-47ea-93df-596d79aa2bf8-encryption-config\") pod \"apiserver-76f77b778f-864h4\" (UID: \"9c1c82f3-080b-47ea-93df-596d79aa2bf8\") " pod="openshift-apiserver/apiserver-76f77b778f-864h4" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594109 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/588eb87c-d2c0-45fb-a0f7-33de36d5d745-client-ca\") pod \"route-controller-manager-6576b87f9c-plggv\" (UID: \"588eb87c-d2c0-45fb-a0f7-33de36d5d745\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plggv" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594128 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ee95c71-cb75-4357-aeff-c0417a0c6eb3-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-flxks\" (UID: \"1ee95c71-cb75-4357-aeff-c0417a0c6eb3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-flxks" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594161 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9c1c82f3-080b-47ea-93df-596d79aa2bf8-node-pullsecrets\") pod \"apiserver-76f77b778f-864h4\" (UID: \"9c1c82f3-080b-47ea-93df-596d79aa2bf8\") " pod="openshift-apiserver/apiserver-76f77b778f-864h4" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594185 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594203 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c1c82f3-080b-47ea-93df-596d79aa2bf8-serving-cert\") pod \"apiserver-76f77b778f-864h4\" (UID: \"9c1c82f3-080b-47ea-93df-596d79aa2bf8\") " 
pod="openshift-apiserver/apiserver-76f77b778f-864h4" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594219 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd1c8c06-710c-401b-803e-9cc18aa1b4b6-config\") pod \"controller-manager-879f6c89f-mkjd2\" (UID: \"cd1c8c06-710c-401b-803e-9cc18aa1b4b6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mkjd2" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594237 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c1c82f3-080b-47ea-93df-596d79aa2bf8-config\") pod \"apiserver-76f77b778f-864h4\" (UID: \"9c1c82f3-080b-47ea-93df-596d79aa2bf8\") " pod="openshift-apiserver/apiserver-76f77b778f-864h4" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594252 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9c1c82f3-080b-47ea-93df-596d79aa2bf8-audit-dir\") pod \"apiserver-76f77b778f-864h4\" (UID: \"9c1c82f3-080b-47ea-93df-596d79aa2bf8\") " pod="openshift-apiserver/apiserver-76f77b778f-864h4" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594266 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/588eb87c-d2c0-45fb-a0f7-33de36d5d745-serving-cert\") pod \"route-controller-manager-6576b87f9c-plggv\" (UID: \"588eb87c-d2c0-45fb-a0f7-33de36d5d745\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plggv" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594280 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-audit-policies\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594296 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw6qm\" (UniqueName: \"kubernetes.io/projected/1ee95c71-cb75-4357-aeff-c0417a0c6eb3-kube-api-access-fw6qm\") pod \"machine-api-operator-5694c8668f-flxks\" (UID: \"1ee95c71-cb75-4357-aeff-c0417a0c6eb3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-flxks" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594311 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6n67\" (UniqueName: \"kubernetes.io/projected/e6a042fb-9acf-4d69-9583-23ccf76753f8-kube-api-access-k6n67\") pod \"machine-approver-56656f9798-84ftw\" (UID: \"e6a042fb-9acf-4d69-9583-23ccf76753f8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-84ftw" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594334 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h" Mar 21 03:49:43 crc 
kubenswrapper[4685]: I0321 03:49:43.594349 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w7ld\" (UniqueName: \"kubernetes.io/projected/db754bd5-f41e-4467-ba57-f89651b75dd1-kube-api-access-2w7ld\") pod \"openshift-apiserver-operator-796bbdcf4f-n6cmp\" (UID: \"db754bd5-f41e-4467-ba57-f89651b75dd1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n6cmp" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594366 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594383 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594397 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/588eb87c-d2c0-45fb-a0f7-33de36d5d745-config\") pod \"route-controller-manager-6576b87f9c-plggv\" (UID: \"588eb87c-d2c0-45fb-a0f7-33de36d5d745\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plggv" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594411 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9c1c82f3-080b-47ea-93df-596d79aa2bf8-etcd-serving-ca\") pod \"apiserver-76f77b778f-864h4\" (UID: \"9c1c82f3-080b-47ea-93df-596d79aa2bf8\") " pod="openshift-apiserver/apiserver-76f77b778f-864h4" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594424 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c1c82f3-080b-47ea-93df-596d79aa2bf8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-864h4\" (UID: \"9c1c82f3-080b-47ea-93df-596d79aa2bf8\") " pod="openshift-apiserver/apiserver-76f77b778f-864h4" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594441 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-audit-dir\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594456 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e6a042fb-9acf-4d69-9583-23ccf76753f8-machine-approver-tls\") pod \"machine-approver-56656f9798-84ftw\" (UID: \"e6a042fb-9acf-4d69-9583-23ccf76753f8\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-84ftw" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594473 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9c1c82f3-080b-47ea-93df-596d79aa2bf8-etcd-client\") pod \"apiserver-76f77b778f-864h4\" (UID: \"9c1c82f3-080b-47ea-93df-596d79aa2bf8\") " pod="openshift-apiserver/apiserver-76f77b778f-864h4" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594514 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594531 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6a042fb-9acf-4d69-9583-23ccf76753f8-config\") pod \"machine-approver-56656f9798-84ftw\" (UID: \"e6a042fb-9acf-4d69-9583-23ccf76753f8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-84ftw" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594558 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1ee95c71-cb75-4357-aeff-c0417a0c6eb3-images\") pod \"machine-api-operator-5694c8668f-flxks\" (UID: \"1ee95c71-cb75-4357-aeff-c0417a0c6eb3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-flxks" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594575 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/9c1c82f3-080b-47ea-93df-596d79aa2bf8-audit\") pod \"apiserver-76f77b778f-864h4\" (UID: \"9c1c82f3-080b-47ea-93df-596d79aa2bf8\") " pod="openshift-apiserver/apiserver-76f77b778f-864h4" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594591 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e6a042fb-9acf-4d69-9583-23ccf76753f8-auth-proxy-config\") pod \"machine-approver-56656f9798-84ftw\" (UID: \"e6a042fb-9acf-4d69-9583-23ccf76753f8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-84ftw" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594605 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594621 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db754bd5-f41e-4467-ba57-f89651b75dd1-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-n6cmp\" (UID: \"db754bd5-f41e-4467-ba57-f89651b75dd1\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n6cmp" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594634 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cd1c8c06-710c-401b-803e-9cc18aa1b4b6-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-mkjd2\" (UID: \"cd1c8c06-710c-401b-803e-9cc18aa1b4b6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mkjd2" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594651 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594665 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594680 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzbmm\" (UniqueName: \"kubernetes.io/projected/9c1c82f3-080b-47ea-93df-596d79aa2bf8-kube-api-access-xzbmm\") pod \"apiserver-76f77b778f-864h4\" (UID: \"9c1c82f3-080b-47ea-93df-596d79aa2bf8\") " pod="openshift-apiserver/apiserver-76f77b778f-864h4" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594699 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ee95c71-cb75-4357-aeff-c0417a0c6eb3-config\") pod \"machine-api-operator-5694c8668f-flxks\" (UID: \"1ee95c71-cb75-4357-aeff-c0417a0c6eb3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-flxks" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.594714 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db754bd5-f41e-4467-ba57-f89651b75dd1-config\") pod \"openshift-apiserver-operator-796bbdcf4f-n6cmp\" (UID: \"db754bd5-f41e-4467-ba57-f89651b75dd1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n6cmp" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.599114 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd1c8c06-710c-401b-803e-9cc18aa1b4b6-client-ca\") pod \"controller-manager-879f6c89f-mkjd2\" (UID: \"cd1c8c06-710c-401b-803e-9cc18aa1b4b6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mkjd2" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.601851 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.602126 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 21 03:49:43 crc 
kubenswrapper[4685]: I0321 03:49:43.602356 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.602604 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.604497 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-pjvx8"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.605313 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.605771 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.605955 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.606031 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6tr2m"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.606212 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.605860 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-clz2m"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.606774 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.607268 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.607624 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-p5r46"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.614063 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-jjffn"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.615084 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jjffn"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.616013 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-clz2m"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.636704 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.641281 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-52mpk"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.642107 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-vds97"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.642452 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tg9t2"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.647357 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5bwhq"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.647881 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5bwhq"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.647995 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tg9t2"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.643024 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-52mpk"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.643921 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-vds97"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.652599 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.654112 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.654426 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.658328 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.662417 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dk5lq"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.663104 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2jl6z"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.663625 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-l8n7r"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.664096 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.664189 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l8n7r"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.664312 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dk5lq"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.665759 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jl6z"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.667645 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m9xxt"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.668136 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m9xxt"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.668821 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-646sv"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.698062 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mlpft"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.698183 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-646sv"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.700108 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.700145 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.700174 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w7ld\" (UniqueName: \"kubernetes.io/projected/db754bd5-f41e-4467-ba57-f89651b75dd1-kube-api-access-2w7ld\") pod \"openshift-apiserver-operator-796bbdcf4f-n6cmp\" (UID: \"db754bd5-f41e-4467-ba57-f89651b75dd1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n6cmp"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.700199 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.700225 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9c1c82f3-080b-47ea-93df-596d79aa2bf8-etcd-serving-ca\") pod \"apiserver-76f77b778f-864h4\" (UID: \"9c1c82f3-080b-47ea-93df-596d79aa2bf8\") " pod="openshift-apiserver/apiserver-76f77b778f-864h4"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.700250 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c1c82f3-080b-47ea-93df-596d79aa2bf8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-864h4\" (UID: \"9c1c82f3-080b-47ea-93df-596d79aa2bf8\") " pod="openshift-apiserver/apiserver-76f77b778f-864h4"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.700267 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-n9mn4"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.700677 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fb258"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.701010 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-skpjj"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.701298 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pcwhp"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.701559 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/588eb87c-d2c0-45fb-a0f7-33de36d5d745-config\") pod \"route-controller-manager-6576b87f9c-plggv\" (UID: \"588eb87c-d2c0-45fb-a0f7-33de36d5d745\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plggv"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.701592 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9c1c82f3-080b-47ea-93df-596d79aa2bf8-etcd-serving-ca\") pod \"apiserver-76f77b778f-864h4\" (UID: \"9c1c82f3-080b-47ea-93df-596d79aa2bf8\") " pod="openshift-apiserver/apiserver-76f77b778f-864h4"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.701608 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-pcwhp"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.701658 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mlpft"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.700270 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/588eb87c-d2c0-45fb-a0f7-33de36d5d745-config\") pod \"route-controller-manager-6576b87f9c-plggv\" (UID: \"588eb87c-d2c0-45fb-a0f7-33de36d5d745\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plggv"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.701690 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-audit-dir\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.701705 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e6a042fb-9acf-4d69-9583-23ccf76753f8-machine-approver-tls\") pod \"machine-approver-56656f9798-84ftw\" (UID: \"e6a042fb-9acf-4d69-9583-23ccf76753f8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-84ftw"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.701730 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9c1c82f3-080b-47ea-93df-596d79aa2bf8-etcd-client\") pod \"apiserver-76f77b778f-864h4\" (UID: \"9c1c82f3-080b-47ea-93df-596d79aa2bf8\") " pod="openshift-apiserver/apiserver-76f77b778f-864h4"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.701744 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6a042fb-9acf-4d69-9583-23ccf76753f8-config\") pod \"machine-approver-56656f9798-84ftw\" (UID: \"e6a042fb-9acf-4d69-9583-23ccf76753f8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-84ftw"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.701758 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1ee95c71-cb75-4357-aeff-c0417a0c6eb3-images\") pod \"machine-api-operator-5694c8668f-flxks\" (UID: \"1ee95c71-cb75-4357-aeff-c0417a0c6eb3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-flxks"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.701773 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.701788 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/9c1c82f3-080b-47ea-93df-596d79aa2bf8-audit\") pod \"apiserver-76f77b778f-864h4\" (UID: \"9c1c82f3-080b-47ea-93df-596d79aa2bf8\") " pod="openshift-apiserver/apiserver-76f77b778f-864h4"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.701805 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e6a042fb-9acf-4d69-9583-23ccf76753f8-auth-proxy-config\") pod \"machine-approver-56656f9798-84ftw\" (UID: \"e6a042fb-9acf-4d69-9583-23ccf76753f8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-84ftw"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.701822 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.701859 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db754bd5-f41e-4467-ba57-f89651b75dd1-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-n6cmp\" (UID: \"db754bd5-f41e-4467-ba57-f89651b75dd1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n6cmp"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.701874 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cd1c8c06-710c-401b-803e-9cc18aa1b4b6-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-mkjd2\" (UID: \"cd1c8c06-710c-401b-803e-9cc18aa1b4b6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mkjd2"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.701889 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.701905 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzbmm\" (UniqueName: \"kubernetes.io/projected/9c1c82f3-080b-47ea-93df-596d79aa2bf8-kube-api-access-xzbmm\") pod \"apiserver-76f77b778f-864h4\" (UID: \"9c1c82f3-080b-47ea-93df-596d79aa2bf8\") " pod="openshift-apiserver/apiserver-76f77b778f-864h4"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.701918 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ee95c71-cb75-4357-aeff-c0417a0c6eb3-config\") pod \"machine-api-operator-5694c8668f-flxks\" (UID: \"1ee95c71-cb75-4357-aeff-c0417a0c6eb3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-flxks"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.701936 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.701951 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db754bd5-f41e-4467-ba57-f89651b75dd1-config\") pod \"openshift-apiserver-operator-796bbdcf4f-n6cmp\" (UID: \"db754bd5-f41e-4467-ba57-f89651b75dd1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n6cmp"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.701963 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-skpjj"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.701975 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd1c8c06-710c-401b-803e-9cc18aa1b4b6-client-ca\") pod \"controller-manager-879f6c89f-mkjd2\" (UID: \"cd1c8c06-710c-401b-803e-9cc18aa1b4b6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mkjd2"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.702002 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.701565 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n8ttn"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.702369 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.702459 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-n9mn4"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.702578 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c1c82f3-080b-47ea-93df-596d79aa2bf8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-864h4\" (UID: \"9c1c82f3-080b-47ea-93df-596d79aa2bf8\") " pod="openshift-apiserver/apiserver-76f77b778f-864h4"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.702976 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.703105 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.703369 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1ee95c71-cb75-4357-aeff-c0417a0c6eb3-images\") pod \"machine-api-operator-5694c8668f-flxks\" (UID: \"1ee95c71-cb75-4357-aeff-c0417a0c6eb3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-flxks"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.703423 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-audit-dir\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.703457 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-qflxg"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.703589 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db754bd5-f41e-4467-ba57-f89651b75dd1-config\") pod \"openshift-apiserver-operator-796bbdcf4f-n6cmp\" (UID: \"db754bd5-f41e-4467-ba57-f89651b75dd1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n6cmp"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.702022 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b5xg\" (UniqueName: \"kubernetes.io/projected/cd1c8c06-710c-401b-803e-9cc18aa1b4b6-kube-api-access-6b5xg\") pod \"controller-manager-879f6c89f-mkjd2\" (UID: \"cd1c8c06-710c-401b-803e-9cc18aa1b4b6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mkjd2"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.703706 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.703723 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf5rg\" (UniqueName: \"kubernetes.io/projected/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-kube-api-access-hf5rg\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.703741 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbxpx\" (UniqueName: \"kubernetes.io/projected/588eb87c-d2c0-45fb-a0f7-33de36d5d745-kube-api-access-hbxpx\") pod \"route-controller-manager-6576b87f9c-plggv\" (UID: \"588eb87c-d2c0-45fb-a0f7-33de36d5d745\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plggv"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.703756 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/9c1c82f3-080b-47ea-93df-596d79aa2bf8-image-import-ca\") pod \"apiserver-76f77b778f-864h4\" (UID: \"9c1c82f3-080b-47ea-93df-596d79aa2bf8\") " pod="openshift-apiserver/apiserver-76f77b778f-864h4"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.703775 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd1c8c06-710c-401b-803e-9cc18aa1b4b6-serving-cert\") pod \"controller-manager-879f6c89f-mkjd2\" (UID: \"cd1c8c06-710c-401b-803e-9cc18aa1b4b6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mkjd2"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.703790 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.703813 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9c1c82f3-080b-47ea-93df-596d79aa2bf8-encryption-config\") pod \"apiserver-76f77b778f-864h4\" (UID: \"9c1c82f3-080b-47ea-93df-596d79aa2bf8\") " pod="openshift-apiserver/apiserver-76f77b778f-864h4"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.703831 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ee95c71-cb75-4357-aeff-c0417a0c6eb3-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-flxks\" (UID: \"1ee95c71-cb75-4357-aeff-c0417a0c6eb3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-flxks"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.703862 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/588eb87c-d2c0-45fb-a0f7-33de36d5d745-client-ca\") pod \"route-controller-manager-6576b87f9c-plggv\" (UID: \"588eb87c-d2c0-45fb-a0f7-33de36d5d745\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plggv"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.703878 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9c1c82f3-080b-47ea-93df-596d79aa2bf8-node-pullsecrets\") pod \"apiserver-76f77b778f-864h4\" (UID: \"9c1c82f3-080b-47ea-93df-596d79aa2bf8\") " pod="openshift-apiserver/apiserver-76f77b778f-864h4"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.703901 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.703915 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c1c82f3-080b-47ea-93df-596d79aa2bf8-serving-cert\") pod \"apiserver-76f77b778f-864h4\" (UID: \"9c1c82f3-080b-47ea-93df-596d79aa2bf8\") " pod="openshift-apiserver/apiserver-76f77b778f-864h4"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.703930 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd1c8c06-710c-401b-803e-9cc18aa1b4b6-config\") pod \"controller-manager-879f6c89f-mkjd2\" (UID: \"cd1c8c06-710c-401b-803e-9cc18aa1b4b6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mkjd2"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.703945 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/588eb87c-d2c0-45fb-a0f7-33de36d5d745-serving-cert\") pod \"route-controller-manager-6576b87f9c-plggv\" (UID: \"588eb87c-d2c0-45fb-a0f7-33de36d5d745\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plggv"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.703962 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-audit-policies\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.703976 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c1c82f3-080b-47ea-93df-596d79aa2bf8-config\") pod \"apiserver-76f77b778f-864h4\" (UID: \"9c1c82f3-080b-47ea-93df-596d79aa2bf8\") " pod="openshift-apiserver/apiserver-76f77b778f-864h4"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.703989 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9c1c82f3-080b-47ea-93df-596d79aa2bf8-audit-dir\") pod \"apiserver-76f77b778f-864h4\" (UID: \"9c1c82f3-080b-47ea-93df-596d79aa2bf8\") " pod="openshift-apiserver/apiserver-76f77b778f-864h4"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.704004 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw6qm\" (UniqueName: \"kubernetes.io/projected/1ee95c71-cb75-4357-aeff-c0417a0c6eb3-kube-api-access-fw6qm\") pod \"machine-api-operator-5694c8668f-flxks\" (UID: \"1ee95c71-cb75-4357-aeff-c0417a0c6eb3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-flxks"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.704025 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6n67\" (UniqueName: \"kubernetes.io/projected/e6a042fb-9acf-4d69-9583-23ccf76753f8-kube-api-access-k6n67\") pod \"machine-approver-56656f9798-84ftw\" (UID: \"e6a042fb-9acf-4d69-9583-23ccf76753f8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-84ftw"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.704527 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ee95c71-cb75-4357-aeff-c0417a0c6eb3-config\") pod \"machine-api-operator-5694c8668f-flxks\" (UID: \"1ee95c71-cb75-4357-aeff-c0417a0c6eb3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-flxks"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.704695 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd1c8c06-710c-401b-803e-9cc18aa1b4b6-client-ca\") pod \"controller-manager-879f6c89f-mkjd2\" (UID: \"cd1c8c06-710c-401b-803e-9cc18aa1b4b6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mkjd2"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.705873 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.706072 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vktqs"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.706565 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567748-zv7h8"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.706953 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567745-czvxj"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.707482 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.707547 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-864h4"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.707567 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-plggv"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.707579 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-flxks"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.707603 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-hncrq"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.708167 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mkjd2"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.708186 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2ptjk"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.708199 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m7q8h"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.708210 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-cd44l"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.708223 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-p5r46"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.708235 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-n9mn4"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.708247 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6tr2m"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.708259 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-52mpk"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.708269 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mlpft"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.708279 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n6cmp"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.708341 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cd1c8c06-710c-401b-803e-9cc18aa1b4b6-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-mkjd2\" (UID: \"cd1c8c06-710c-401b-803e-9cc18aa1b4b6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mkjd2"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.708360 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hncrq"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.708432 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-djlmv"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.708607 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n8ttn"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.708813 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-qflxg"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.708985 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/588eb87c-d2c0-45fb-a0f7-33de36d5d745-client-ca\") pod \"route-controller-manager-6576b87f9c-plggv\" (UID: \"588eb87c-d2c0-45fb-a0f7-33de36d5d745\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plggv"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.709007 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vktqs"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.709191 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567748-zv7h8"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.723019 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6a042fb-9acf-4d69-9583-23ccf76753f8-config\") pod \"machine-approver-56656f9798-84ftw\" (UID: \"e6a042fb-9acf-4d69-9583-23ccf76753f8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-84ftw"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.723546 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e6a042fb-9acf-4d69-9583-23ccf76753f8-machine-approver-tls\") pod \"machine-approver-56656f9798-84ftw\" (UID: \"e6a042fb-9acf-4d69-9583-23ccf76753f8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-84ftw"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.724588 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9c1c82f3-080b-47ea-93df-596d79aa2bf8-node-pullsecrets\") pod \"apiserver-76f77b778f-864h4\" (UID: \"9c1c82f3-080b-47ea-93df-596d79aa2bf8\") " pod="openshift-apiserver/apiserver-76f77b778f-864h4"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.725942 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/9c1c82f3-080b-47ea-93df-596d79aa2bf8-image-import-ca\") pod \"apiserver-76f77b778f-864h4\" (UID: \"9c1c82f3-080b-47ea-93df-596d79aa2bf8\") " pod="openshift-apiserver/apiserver-76f77b778f-864h4"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.728920 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.730144 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd1c8c06-710c-401b-803e-9cc18aa1b4b6-serving-cert\") pod \"controller-manager-879f6c89f-mkjd2\" (UID: \"cd1c8c06-710c-401b-803e-9cc18aa1b4b6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mkjd2"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.730518 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.730517 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9c1c82f3-080b-47ea-93df-596d79aa2bf8-etcd-client\") pod \"apiserver-76f77b778f-864h4\" (UID: \"9c1c82f3-080b-47ea-93df-596d79aa2bf8\") " pod="openshift-apiserver/apiserver-76f77b778f-864h4"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.730988 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db754bd5-f41e-4467-ba57-f89651b75dd1-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-n6cmp\" (UID: \"db754bd5-f41e-4467-ba57-f89651b75dd1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n6cmp"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.734023 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.734145 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9c1c82f3-080b-47ea-93df-596d79aa2bf8-encryption-config\") pod \"apiserver-76f77b778f-864h4\" (UID: \"9c1c82f3-080b-47ea-93df-596d79aa2bf8\") " pod="openshift-apiserver/apiserver-76f77b778f-864h4"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.736238 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.736285 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.736946 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.701776 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fb258"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.737719 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c1c82f3-080b-47ea-93df-596d79aa2bf8-config\") pod \"apiserver-76f77b778f-864h4\" (UID: \"9c1c82f3-080b-47ea-93df-596d79aa2bf8\") " pod="openshift-apiserver/apiserver-76f77b778f-864h4"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.737777 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9c1c82f3-080b-47ea-93df-596d79aa2bf8-audit-dir\") pod \"apiserver-76f77b778f-864h4\" (UID: \"9c1c82f3-080b-47ea-93df-596d79aa2bf8\") " pod="openshift-apiserver/apiserver-76f77b778f-864h4"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.738251 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd1c8c06-710c-401b-803e-9cc18aa1b4b6-config\") pod \"controller-manager-879f6c89f-mkjd2\" (UID: \"cd1c8c06-710c-401b-803e-9cc18aa1b4b6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mkjd2"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.738255 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c1c82f3-080b-47ea-93df-596d79aa2bf8-serving-cert\") pod \"apiserver-76f77b778f-864h4\" (UID: \"9c1c82f3-080b-47ea-93df-596d79aa2bf8\") " pod="openshift-apiserver/apiserver-76f77b778f-864h4"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.738356 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.738806 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/9c1c82f3-080b-47ea-93df-596d79aa2bf8-audit\") pod \"apiserver-76f77b778f-864h4\" (UID: \"9c1c82f3-080b-47ea-93df-596d79aa2bf8\") " pod="openshift-apiserver/apiserver-76f77b778f-864h4"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.738994 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e6a042fb-9acf-4d69-9583-23ccf76753f8-auth-proxy-config\") pod \"machine-approver-56656f9798-84ftw\" (UID: \"e6a042fb-9acf-4d69-9583-23ccf76753f8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-84ftw"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.739615 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/588eb87c-d2c0-45fb-a0f7-33de36d5d745-serving-cert\") pod \"route-controller-manager-6576b87f9c-plggv\" (UID: \"588eb87c-d2c0-45fb-a0f7-33de36d5d745\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plggv"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.740085 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-audit-policies\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.741758 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567745-czvxj"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.741775 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.743326 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-97584"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.753291 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m9xxt"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.753692 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ee95c71-cb75-4357-aeff-c0417a0c6eb3-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-flxks\" (UID: \"1ee95c71-cb75-4357-aeff-c0417a0c6eb3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-flxks"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.755230 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2tvkz"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.755944 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.757689 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.758752 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-l8n7r"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.765668 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-skpjj"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.768988 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-646sv"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.770422 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2ckzq"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.772188 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2jl6z"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.773090 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dk5lq"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.774345 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n8ttn"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.775758 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pcwhp"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.777466 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5bwhq"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.778387 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-clz2m"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.779328 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.779661 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.781208 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fb258"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.782225 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-j8qrk"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.783301 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tg9t2"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.784443 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-jjffn"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.785759 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-pjvx8"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.786911 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567748-zv7h8"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.787950 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vktqs"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.789181 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-qflxg"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.790395 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567745-czvxj"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.791425 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hncrq"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.792451 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-qsvzc"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.793325 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-qsvzc"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.793517 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2pgbl"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.794816 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-2pgbl"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.795002 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2pgbl"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.796103 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-k7m7b"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.797383 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-k7m7b"]
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.797485 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-k7m7b"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.798481 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.818747 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.838359 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.858421 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.879553 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.898138 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.920707 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.938652 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.958563 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 21 03:49:43 crc kubenswrapper[4685]: I0321 03:49:43.978240 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.012753 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.018535 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.040029 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.059818 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.080942 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.119017 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.139434 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.158293 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.179274 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.198906 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.219275 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.239118 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.259172 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.278979 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.299080 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.318885 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.339081 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.358648 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.378823 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.399389 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.419107 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.439179 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.459438 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.479117 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.498787 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.519522 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.539048 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.559473 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.579892 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.599442 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.619258 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.638805 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.658492 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.679079 4685 request.go:700] Waited for 1.007866583s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/secrets?fieldSelector=metadata.name%3Dcluster-image-registry-operator-dockercfg-m4qtx&limit=500&resourceVersion=0
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.681198 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.699159 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.719657 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.739348 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.780256 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.787772 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w7ld\" (UniqueName: \"kubernetes.io/projected/db754bd5-f41e-4467-ba57-f89651b75dd1-kube-api-access-2w7ld\") pod \"openshift-apiserver-operator-796bbdcf4f-n6cmp\" (UID: \"db754bd5-f41e-4467-ba57-f89651b75dd1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n6cmp"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.799916 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.818651 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.839651 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.859181 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.881543 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.895343 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n6cmp"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.898806 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.928186 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.940243 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.959953 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 21 03:49:44 crc kubenswrapper[4685]: I0321 03:49:44.980648 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.000054 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.039710 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.040909 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6n67\" (UniqueName: \"kubernetes.io/projected/e6a042fb-9acf-4d69-9583-23ccf76753f8-kube-api-access-k6n67\") pod \"machine-approver-56656f9798-84ftw\" (UID: \"e6a042fb-9acf-4d69-9583-23ccf76753f8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-84ftw"
Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.060011 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.079474 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.087193 4685 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-84ftw" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.100257 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 21 03:49:45 crc kubenswrapper[4685]: W0321 03:49:45.104369 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6a042fb_9acf_4d69_9583_23ccf76753f8.slice/crio-07f50fbbce8000b2dbb13eca1abf41793dcfa523a52c418c67cbda0656ff17a3 WatchSource:0}: Error finding container 07f50fbbce8000b2dbb13eca1abf41793dcfa523a52c418c67cbda0656ff17a3: Status 404 returned error can't find the container with id 07f50fbbce8000b2dbb13eca1abf41793dcfa523a52c418c67cbda0656ff17a3 Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.139202 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.139548 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b5xg\" (UniqueName: \"kubernetes.io/projected/cd1c8c06-710c-401b-803e-9cc18aa1b4b6-kube-api-access-6b5xg\") pod \"controller-manager-879f6c89f-mkjd2\" (UID: \"cd1c8c06-710c-401b-803e-9cc18aa1b4b6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mkjd2" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.158281 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.193936 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf5rg\" (UniqueName: \"kubernetes.io/projected/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-kube-api-access-hf5rg\") pod \"oauth-openshift-558db77b4-m7q8h\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.222892 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n6cmp"] Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.223529 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbxpx\" (UniqueName: \"kubernetes.io/projected/588eb87c-d2c0-45fb-a0f7-33de36d5d745-kube-api-access-hbxpx\") pod \"route-controller-manager-6576b87f9c-plggv\" (UID: \"588eb87c-d2c0-45fb-a0f7-33de36d5d745\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plggv" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.234397 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzbmm\" (UniqueName: \"kubernetes.io/projected/9c1c82f3-080b-47ea-93df-596d79aa2bf8-kube-api-access-xzbmm\") pod \"apiserver-76f77b778f-864h4\" (UID: \"9c1c82f3-080b-47ea-93df-596d79aa2bf8\") " pod="openshift-apiserver/apiserver-76f77b778f-864h4" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.238909 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.258862 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.278767 4685 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.299074 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.311072 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-864h4" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.318664 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.323637 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plggv" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.339668 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.359033 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.379769 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.399877 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.419078 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.431154 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mkjd2" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.439317 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.444635 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.478157 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-864h4"] Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.483498 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw6qm\" (UniqueName: \"kubernetes.io/projected/1ee95c71-cb75-4357-aeff-c0417a0c6eb3-kube-api-access-fw6qm\") pod \"machine-api-operator-5694c8668f-flxks\" (UID: \"1ee95c71-cb75-4357-aeff-c0417a0c6eb3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-flxks" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.485007 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 21 03:49:45 crc kubenswrapper[4685]: W0321 03:49:45.489441 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c1c82f3_080b_47ea_93df_596d79aa2bf8.slice/crio-14fa168a22ef42f84569389e4f9d0673fc65e52797dd52704a04a290f793fa8d WatchSource:0}: Error finding container 14fa168a22ef42f84569389e4f9d0673fc65e52797dd52704a04a290f793fa8d: Status 404 returned error can't find the container with id 14fa168a22ef42f84569389e4f9d0673fc65e52797dd52704a04a290f793fa8d Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.501927 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.519210 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.523857 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-plggv"] Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.557988 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.579277 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.597926 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n6cmp" event={"ID":"db754bd5-f41e-4467-ba57-f89651b75dd1","Type":"ContainerStarted","Data":"49f956632877737ed4b1329b2f25de6d1226dfad21faacbbcd8347877f6be0b2"} Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.597978 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n6cmp" event={"ID":"db754bd5-f41e-4467-ba57-f89651b75dd1","Type":"ContainerStarted","Data":"0273c8e3c968576f8f74a1905cdd8ed47b4ad74b8e579766929e63859721b63b"} Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.600711 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.601657 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plggv" 
event={"ID":"588eb87c-d2c0-45fb-a0f7-33de36d5d745","Type":"ContainerStarted","Data":"83be5ab2f9a927228dede71a95aaf214baa9ba4f46e25f24f7cc896ee00a97d6"} Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.603193 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-864h4" event={"ID":"9c1c82f3-080b-47ea-93df-596d79aa2bf8","Type":"ContainerStarted","Data":"14fa168a22ef42f84569389e4f9d0673fc65e52797dd52704a04a290f793fa8d"} Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.609629 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mkjd2"] Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.609802 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-84ftw" event={"ID":"e6a042fb-9acf-4d69-9583-23ccf76753f8","Type":"ContainerStarted","Data":"e5df9ac38a7ce953be2da97275718f9a0eb442fee9942e4d04f43f81986fa1d4"} Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.609849 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-84ftw" event={"ID":"e6a042fb-9acf-4d69-9583-23ccf76753f8","Type":"ContainerStarted","Data":"07f50fbbce8000b2dbb13eca1abf41793dcfa523a52c418c67cbda0656ff17a3"} Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.619552 4685 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.639707 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m7q8h"] Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.640476 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.662637 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.667642 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-flxks" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.680909 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.697701 4685 request.go:700] Waited for 1.900042101s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/secrets?fieldSelector=metadata.name%3Dcanary-serving-cert&limit=500&resourceVersion=0 Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.701232 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.719659 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.742669 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.832105 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2efbba77-f5bd-48cb-a790-f4c3564acb75-metrics-tls\") pod \"ingress-operator-5b745b69d9-jjffn\" (UID: \"2efbba77-f5bd-48cb-a790-f4c3564acb75\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jjffn" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.832466 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kv44\" (UniqueName: \"kubernetes.io/projected/2efbba77-f5bd-48cb-a790-f4c3564acb75-kube-api-access-9kv44\") pod \"ingress-operator-5b745b69d9-jjffn\" (UID: \"2efbba77-f5bd-48cb-a790-f4c3564acb75\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jjffn" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.832489 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c21a705-7e47-4418-803a-41a459acef90-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6tr2m\" (UID: \"7c21a705-7e47-4418-803a-41a459acef90\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6tr2m" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.832548 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b13d8427-dcef-4925-92b6-0e6bf1aca8c8-stats-auth\") pod \"router-default-5444994796-vds97\" (UID: \"b13d8427-dcef-4925-92b6-0e6bf1aca8c8\") " pod="openshift-ingress/router-default-5444994796-vds97" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.832582 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ca5cb48c-43cc-428f-bf99-ba396d595e5c-encryption-config\") pod \"apiserver-7bbb656c7d-tb9dl\" (UID: \"ca5cb48c-43cc-428f-bf99-ba396d595e5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.832737 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/49f645cb-7805-4ded-9f3e-d43bdb3801a6-console-config\") pod \"console-f9d7485db-j8qrk\" (UID: \"49f645cb-7805-4ded-9f3e-d43bdb3801a6\") " pod="openshift-console/console-f9d7485db-j8qrk" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.832873 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqdgc\" (UniqueName: \"kubernetes.io/projected/5a823511-d878-4e6d-acda-4202e00e3aab-kube-api-access-tqdgc\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.832911 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd62f2fc-10ef-4a6e-80f1-0813f3a681bd-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-p5r46\" (UID: \"dd62f2fc-10ef-4a6e-80f1-0813f3a681bd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-p5r46" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.832932 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/49f645cb-7805-4ded-9f3e-d43bdb3801a6-console-oauth-config\") pod \"console-f9d7485db-j8qrk\" (UID: \"49f645cb-7805-4ded-9f3e-d43bdb3801a6\") " pod="openshift-console/console-f9d7485db-j8qrk" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.832953 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca5cb48c-43cc-428f-bf99-ba396d595e5c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-tb9dl\" (UID: \"ca5cb48c-43cc-428f-bf99-ba396d595e5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.833050 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.833109 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e7908ff-df99-4827-8a79-ce0b7f0f5d80-serving-cert\") pod \"etcd-operator-b45778765-97584\" (UID: \"9e7908ff-df99-4827-8a79-ce0b7f0f5d80\") " pod="openshift-etcd-operator/etcd-operator-b45778765-97584" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.833162 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/844b8e4f-7ba3-4a52-b6a7-0e8ea0d78003-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-2tvkz\" (UID: \"844b8e4f-7ba3-4a52-b6a7-0e8ea0d78003\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2tvkz" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.833190 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksnq6\" (UniqueName: 
\"kubernetes.io/projected/b13d8427-dcef-4925-92b6-0e6bf1aca8c8-kube-api-access-ksnq6\") pod \"router-default-5444994796-vds97\" (UID: \"b13d8427-dcef-4925-92b6-0e6bf1aca8c8\") " pod="openshift-ingress/router-default-5444994796-vds97" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.833217 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k542\" (UniqueName: \"kubernetes.io/projected/bb0bd79d-5459-4b35-adb8-eca9d0fb069c-kube-api-access-2k542\") pod \"authentication-operator-69f744f599-cd44l\" (UID: \"bb0bd79d-5459-4b35-adb8-eca9d0fb069c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cd44l" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.833278 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtxr4\" (UniqueName: \"kubernetes.io/projected/2663314f-3e35-4ab4-b9b9-28e829cde5de-kube-api-access-dtxr4\") pod \"openshift-controller-manager-operator-756b6f6bc6-2ckzq\" (UID: \"2663314f-3e35-4ab4-b9b9-28e829cde5de\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2ckzq" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.833319 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s67x\" (UniqueName: \"kubernetes.io/projected/9e7908ff-df99-4827-8a79-ce0b7f0f5d80-kube-api-access-2s67x\") pod \"etcd-operator-b45778765-97584\" (UID: \"9e7908ff-df99-4827-8a79-ce0b7f0f5d80\") " pod="openshift-etcd-operator/etcd-operator-b45778765-97584" Mar 21 03:49:45 crc kubenswrapper[4685]: E0321 03:49:45.833404 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:46.33338788 +0000 UTC m=+218.810456672 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.833517 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2efbba77-f5bd-48cb-a790-f4c3564acb75-trusted-ca\") pod \"ingress-operator-5b745b69d9-jjffn\" (UID: \"2efbba77-f5bd-48cb-a790-f4c3564acb75\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jjffn" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.833557 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b429d7c-3700-4d9f-b6b2-554219223515-serving-cert\") pod \"console-operator-58897d9998-djlmv\" (UID: \"7b429d7c-3700-4d9f-b6b2-554219223515\") " pod="openshift-console-operator/console-operator-58897d9998-djlmv" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.833663 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb0bd79d-5459-4b35-adb8-eca9d0fb069c-serving-cert\") pod \"authentication-operator-69f744f599-cd44l\" (UID: \"bb0bd79d-5459-4b35-adb8-eca9d0fb069c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cd44l" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.833698 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7c21a705-7e47-4418-803a-41a459acef90-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6tr2m\" (UID: \"7c21a705-7e47-4418-803a-41a459acef90\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6tr2m" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.833788 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5a823511-d878-4e6d-acda-4202e00e3aab-bound-sa-token\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.833818 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e7908ff-df99-4827-8a79-ce0b7f0f5d80-config\") pod \"etcd-operator-b45778765-97584\" (UID: \"9e7908ff-df99-4827-8a79-ce0b7f0f5d80\") " pod="openshift-etcd-operator/etcd-operator-b45778765-97584" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.833857 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c21a705-7e47-4418-803a-41a459acef90-config\") pod \"kube-apiserver-operator-766d6c64bb-6tr2m\" (UID: \"7c21a705-7e47-4418-803a-41a459acef90\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6tr2m" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 
03:49:45.833874 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/49f645cb-7805-4ded-9f3e-d43bdb3801a6-oauth-serving-cert\") pod \"console-f9d7485db-j8qrk\" (UID: \"49f645cb-7805-4ded-9f3e-d43bdb3801a6\") " pod="openshift-console/console-f9d7485db-j8qrk" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.833895 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2663314f-3e35-4ab4-b9b9-28e829cde5de-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2ckzq\" (UID: \"2663314f-3e35-4ab4-b9b9-28e829cde5de\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2ckzq" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.833962 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5a823511-d878-4e6d-acda-4202e00e3aab-trusted-ca\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.834017 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/8f518bb5-0336-44df-ac60-174c8426974e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2ptjk\" (UID: \"8f518bb5-0336-44df-ac60-174c8426974e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2ptjk" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.834061 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2663314f-3e35-4ab4-b9b9-28e829cde5de-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2ckzq\" (UID: \"2663314f-3e35-4ab4-b9b9-28e829cde5de\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2ckzq" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.834092 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb0bd79d-5459-4b35-adb8-eca9d0fb069c-config\") pod \"authentication-operator-69f744f599-cd44l\" (UID: \"bb0bd79d-5459-4b35-adb8-eca9d0fb069c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cd44l" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.834131 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9e7908ff-df99-4827-8a79-ce0b7f0f5d80-etcd-client\") pod \"etcd-operator-b45778765-97584\" (UID: \"9e7908ff-df99-4827-8a79-ce0b7f0f5d80\") " pod="openshift-etcd-operator/etcd-operator-b45778765-97584" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.834189 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d841e9a0-5181-40b8-9374-daa38341c4ff-metrics-tls\") pod \"dns-operator-744455d44c-52mpk\" (UID: \"d841e9a0-5181-40b8-9374-daa38341c4ff\") " pod="openshift-dns-operator/dns-operator-744455d44c-52mpk" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.834222 4685 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ca5cb48c-43cc-428f-bf99-ba396d595e5c-audit-dir\") pod \"apiserver-7bbb656c7d-tb9dl\" (UID: \"ca5cb48c-43cc-428f-bf99-ba396d595e5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.834285 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b13d8427-dcef-4925-92b6-0e6bf1aca8c8-service-ca-bundle\") pod \"router-default-5444994796-vds97\" (UID: \"b13d8427-dcef-4925-92b6-0e6bf1aca8c8\") " pod="openshift-ingress/router-default-5444994796-vds97" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.834302 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-flxks"] Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.834345 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49f645cb-7805-4ded-9f3e-d43bdb3801a6-trusted-ca-bundle\") pod \"console-f9d7485db-j8qrk\" (UID: \"49f645cb-7805-4ded-9f3e-d43bdb3801a6\") " pod="openshift-console/console-f9d7485db-j8qrk" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.834374 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2c9z\" (UniqueName: \"kubernetes.io/projected/7b429d7c-3700-4d9f-b6b2-554219223515-kube-api-access-j2c9z\") pod \"console-operator-58897d9998-djlmv\" (UID: \"7b429d7c-3700-4d9f-b6b2-554219223515\") " pod="openshift-console-operator/console-operator-58897d9998-djlmv" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.834411 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvp2w\" (UniqueName: \"kubernetes.io/projected/8f518bb5-0336-44df-ac60-174c8426974e-kube-api-access-xvp2w\") pod \"openshift-config-operator-7777fb866f-2ptjk\" (UID: \"8f518bb5-0336-44df-ac60-174c8426974e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2ptjk" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.834449 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5a823511-d878-4e6d-acda-4202e00e3aab-installation-pull-secrets\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.834469 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd62f2fc-10ef-4a6e-80f1-0813f3a681bd-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-p5r46\" (UID: \"dd62f2fc-10ef-4a6e-80f1-0813f3a681bd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-p5r46" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.834500 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frlpd\" (UniqueName: \"kubernetes.io/projected/844b8e4f-7ba3-4a52-b6a7-0e8ea0d78003-kube-api-access-frlpd\") pod \"cluster-samples-operator-665b6dd947-2tvkz\" 
(UID: \"844b8e4f-7ba3-4a52-b6a7-0e8ea0d78003\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2tvkz" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.834525 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgn8b\" (UniqueName: \"kubernetes.io/projected/ca5cb48c-43cc-428f-bf99-ba396d595e5c-kube-api-access-rgn8b\") pod \"apiserver-7bbb656c7d-tb9dl\" (UID: \"ca5cb48c-43cc-428f-bf99-ba396d595e5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.834560 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5a823511-d878-4e6d-acda-4202e00e3aab-ca-trust-extracted\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.834582 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb0bd79d-5459-4b35-adb8-eca9d0fb069c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-cd44l\" (UID: \"bb0bd79d-5459-4b35-adb8-eca9d0fb069c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cd44l" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.834645 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb0bd79d-5459-4b35-adb8-eca9d0fb069c-service-ca-bundle\") pod \"authentication-operator-69f744f599-cd44l\" (UID: \"bb0bd79d-5459-4b35-adb8-eca9d0fb069c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cd44l" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.834702 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czl5l\" (UniqueName: \"kubernetes.io/projected/49f645cb-7805-4ded-9f3e-d43bdb3801a6-kube-api-access-czl5l\") pod \"console-f9d7485db-j8qrk\" (UID: \"49f645cb-7805-4ded-9f3e-d43bdb3801a6\") " pod="openshift-console/console-f9d7485db-j8qrk" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.834739 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ca5cb48c-43cc-428f-bf99-ba396d595e5c-audit-policies\") pod \"apiserver-7bbb656c7d-tb9dl\" (UID: \"ca5cb48c-43cc-428f-bf99-ba396d595e5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.834790 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2efbba77-f5bd-48cb-a790-f4c3564acb75-bound-sa-token\") pod \"ingress-operator-5b745b69d9-jjffn\" (UID: \"2efbba77-f5bd-48cb-a790-f4c3564acb75\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jjffn" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.834826 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9e7908ff-df99-4827-8a79-ce0b7f0f5d80-etcd-ca\") pod \"etcd-operator-b45778765-97584\" (UID: \"9e7908ff-df99-4827-8a79-ce0b7f0f5d80\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-97584" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.834906 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmxjl\" (UniqueName: \"kubernetes.io/projected/eb165aaf-36a6-4965-bebf-6a40e1695b94-kube-api-access-qmxjl\") pod \"downloads-7954f5f757-clz2m\" (UID: \"eb165aaf-36a6-4965-bebf-6a40e1695b94\") " pod="openshift-console/downloads-7954f5f757-clz2m" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.835053 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd62f2fc-10ef-4a6e-80f1-0813f3a681bd-config\") pod \"kube-controller-manager-operator-78b949d7b-p5r46\" (UID: \"dd62f2fc-10ef-4a6e-80f1-0813f3a681bd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-p5r46" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.835187 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xtkn\" (UniqueName: \"kubernetes.io/projected/d841e9a0-5181-40b8-9374-daa38341c4ff-kube-api-access-6xtkn\") pod \"dns-operator-744455d44c-52mpk\" (UID: \"d841e9a0-5181-40b8-9374-daa38341c4ff\") " pod="openshift-dns-operator/dns-operator-744455d44c-52mpk" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.835281 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ca5cb48c-43cc-428f-bf99-ba396d595e5c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-tb9dl\" (UID: \"ca5cb48c-43cc-428f-bf99-ba396d595e5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.835329 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9e7908ff-df99-4827-8a79-ce0b7f0f5d80-etcd-service-ca\") pod \"etcd-operator-b45778765-97584\" (UID: \"9e7908ff-df99-4827-8a79-ce0b7f0f5d80\") " pod="openshift-etcd-operator/etcd-operator-b45778765-97584" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.835347 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b429d7c-3700-4d9f-b6b2-554219223515-trusted-ca\") pod \"console-operator-58897d9998-djlmv\" (UID: \"7b429d7c-3700-4d9f-b6b2-554219223515\") " pod="openshift-console-operator/console-operator-58897d9998-djlmv" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.835440 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f518bb5-0336-44df-ac60-174c8426974e-serving-cert\") pod \"openshift-config-operator-7777fb866f-2ptjk\" (UID: \"8f518bb5-0336-44df-ac60-174c8426974e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2ptjk" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.835541 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b13d8427-dcef-4925-92b6-0e6bf1aca8c8-default-certificate\") pod \"router-default-5444994796-vds97\" (UID: \"b13d8427-dcef-4925-92b6-0e6bf1aca8c8\") " pod="openshift-ingress/router-default-5444994796-vds97" Mar 21 03:49:45 
crc kubenswrapper[4685]: I0321 03:49:45.835588 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/49f645cb-7805-4ded-9f3e-d43bdb3801a6-console-serving-cert\") pod \"console-f9d7485db-j8qrk\" (UID: \"49f645cb-7805-4ded-9f3e-d43bdb3801a6\") " pod="openshift-console/console-f9d7485db-j8qrk" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.835614 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca5cb48c-43cc-428f-bf99-ba396d595e5c-serving-cert\") pod \"apiserver-7bbb656c7d-tb9dl\" (UID: \"ca5cb48c-43cc-428f-bf99-ba396d595e5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.835636 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ca5cb48c-43cc-428f-bf99-ba396d595e5c-etcd-client\") pod \"apiserver-7bbb656c7d-tb9dl\" (UID: \"ca5cb48c-43cc-428f-bf99-ba396d595e5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.835657 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b429d7c-3700-4d9f-b6b2-554219223515-config\") pod \"console-operator-58897d9998-djlmv\" (UID: \"7b429d7c-3700-4d9f-b6b2-554219223515\") " pod="openshift-console-operator/console-operator-58897d9998-djlmv" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.835804 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/49f645cb-7805-4ded-9f3e-d43bdb3801a6-service-ca\") pod \"console-f9d7485db-j8qrk\" (UID: \"49f645cb-7805-4ded-9f3e-d43bdb3801a6\") " pod="openshift-console/console-f9d7485db-j8qrk" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.835863 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5a823511-d878-4e6d-acda-4202e00e3aab-registry-certificates\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.835892 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b13d8427-dcef-4925-92b6-0e6bf1aca8c8-metrics-certs\") pod \"router-default-5444994796-vds97\" (UID: \"b13d8427-dcef-4925-92b6-0e6bf1aca8c8\") " pod="openshift-ingress/router-default-5444994796-vds97" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.835976 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5a823511-d878-4e6d-acda-4202e00e3aab-registry-tls\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:45 crc kubenswrapper[4685]: W0321 03:49:45.843585 4685 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ee95c71_cb75_4357_aeff_c0417a0c6eb3.slice/crio-7ac29acaf92f0b10e901f5ccbd8418d1717025a63e3d5266f99a800bb30ba95e WatchSource:0}: Error finding container 7ac29acaf92f0b10e901f5ccbd8418d1717025a63e3d5266f99a800bb30ba95e: Status 404 returned error can't find the container with id 7ac29acaf92f0b10e901f5ccbd8418d1717025a63e3d5266f99a800bb30ba95e Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.936724 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:45 crc kubenswrapper[4685]: E0321 03:49:45.936904 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:46.436887046 +0000 UTC m=+218.913955838 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.936979 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5a823511-d878-4e6d-acda-4202e00e3aab-registry-tls\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.937022 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/38f42eb2-0cf6-4c5b-8159-a89e22404a73-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-n9mn4\" (UID: \"38f42eb2-0cf6-4c5b-8159-a89e22404a73\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-n9mn4" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.937044 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kv44\" (UniqueName: \"kubernetes.io/projected/2efbba77-f5bd-48cb-a790-f4c3564acb75-kube-api-access-9kv44\") pod \"ingress-operator-5b745b69d9-jjffn\" (UID: \"2efbba77-f5bd-48cb-a790-f4c3564acb75\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jjffn" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.937065 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmb7h\" (UniqueName: \"kubernetes.io/projected/b7a27ba4-0f5f-4ad6-9883-8698fd160802-kube-api-access-gmb7h\") pod \"cluster-image-registry-operator-dc59b4c8b-m9xxt\" (UID: \"b7a27ba4-0f5f-4ad6-9883-8698fd160802\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m9xxt" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.937278 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-q9ffg\" (UniqueName: \"kubernetes.io/projected/f17fa271-26a6-4620-930b-f30d50f3412b-kube-api-access-q9ffg\") pod \"packageserver-d55dfcdfc-skpjj\" (UID: \"f17fa271-26a6-4620-930b-f30d50f3412b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-skpjj" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.937302 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b13d8427-dcef-4925-92b6-0e6bf1aca8c8-stats-auth\") pod \"router-default-5444994796-vds97\" (UID: \"b13d8427-dcef-4925-92b6-0e6bf1aca8c8\") " pod="openshift-ingress/router-default-5444994796-vds97" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.937318 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4cc9f216-4352-4bff-a0eb-48659e3a603d-serving-cert\") pod \"service-ca-operator-777779d784-vktqs\" (UID: \"4cc9f216-4352-4bff-a0eb-48659e3a603d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vktqs" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.937388 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fr6x\" (UniqueName: \"kubernetes.io/projected/b695978e-b67c-4812-9083-22538cdd3045-kube-api-access-2fr6x\") pod \"csi-hostpathplugin-2pgbl\" (UID: \"b695978e-b67c-4812-9083-22538cdd3045\") " pod="hostpath-provisioner/csi-hostpathplugin-2pgbl" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.937407 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e3945afb-20d9-4312-b9bb-dbe5bc788cda-signing-key\") pod \"service-ca-9c57cc56f-qflxg\" (UID: \"e3945afb-20d9-4312-b9bb-dbe5bc788cda\") " pod="openshift-service-ca/service-ca-9c57cc56f-qflxg" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.937444 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.937462 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd62f2fc-10ef-4a6e-80f1-0813f3a681bd-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-p5r46\" (UID: \"dd62f2fc-10ef-4a6e-80f1-0813f3a681bd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-p5r46" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.937478 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e7908ff-df99-4827-8a79-ce0b7f0f5d80-serving-cert\") pod \"etcd-operator-b45778765-97584\" (UID: \"9e7908ff-df99-4827-8a79-ce0b7f0f5d80\") " pod="openshift-etcd-operator/etcd-operator-b45778765-97584" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.937510 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f17fa271-26a6-4620-930b-f30d50f3412b-tmpfs\") pod 
\"packageserver-d55dfcdfc-skpjj\" (UID: \"f17fa271-26a6-4620-930b-f30d50f3412b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-skpjj" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.937533 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0306bf0e-f0c1-4e47-b63e-909b979c5844-config-volume\") pod \"collect-profiles-29567745-czvxj\" (UID: \"0306bf0e-f0c1-4e47-b63e-909b979c5844\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567745-czvxj" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.937557 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/844b8e4f-7ba3-4a52-b6a7-0e8ea0d78003-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-2tvkz\" (UID: \"844b8e4f-7ba3-4a52-b6a7-0e8ea0d78003\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2tvkz" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.937590 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7a27ba4-0f5f-4ad6-9883-8698fd160802-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-m9xxt\" (UID: \"b7a27ba4-0f5f-4ad6-9883-8698fd160802\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m9xxt" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.937609 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ea531307-d9b1-4798-bac8-d34094d27e2c-srv-cert\") pod \"olm-operator-6b444d44fb-mlpft\" (UID: \"ea531307-d9b1-4798-bac8-d34094d27e2c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mlpft" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.937628 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtxr4\" (UniqueName: \"kubernetes.io/projected/2663314f-3e35-4ab4-b9b9-28e829cde5de-kube-api-access-dtxr4\") pod \"openshift-controller-manager-operator-756b6f6bc6-2ckzq\" (UID: \"2663314f-3e35-4ab4-b9b9-28e829cde5de\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2ckzq" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.937643 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2efbba77-f5bd-48cb-a790-f4c3564acb75-trusted-ca\") pod \"ingress-operator-5b745b69d9-jjffn\" (UID: \"2efbba77-f5bd-48cb-a790-f4c3564acb75\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jjffn" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.937673 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b429d7c-3700-4d9f-b6b2-554219223515-serving-cert\") pod \"console-operator-58897d9998-djlmv\" (UID: \"7b429d7c-3700-4d9f-b6b2-554219223515\") " pod="openshift-console-operator/console-operator-58897d9998-djlmv" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.937690 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgshs\" (UniqueName: \"kubernetes.io/projected/4ede3f08-f29b-4cb9-a96f-1c66239498f6-kube-api-access-kgshs\") pod \"auto-csr-approver-29567748-zv7h8\" 
(UID: \"4ede3f08-f29b-4cb9-a96f-1c66239498f6\") " pod="openshift-infra/auto-csr-approver-29567748-zv7h8" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.937707 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb0bd79d-5459-4b35-adb8-eca9d0fb069c-serving-cert\") pod \"authentication-operator-69f744f599-cd44l\" (UID: \"bb0bd79d-5459-4b35-adb8-eca9d0fb069c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cd44l" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.937722 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cc9f216-4352-4bff-a0eb-48659e3a603d-config\") pod \"service-ca-operator-777779d784-vktqs\" (UID: \"4cc9f216-4352-4bff-a0eb-48659e3a603d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vktqs" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.937758 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7c21a705-7e47-4418-803a-41a459acef90-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6tr2m\" (UID: \"7c21a705-7e47-4418-803a-41a459acef90\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6tr2m" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.937773 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5a823511-d878-4e6d-acda-4202e00e3aab-bound-sa-token\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.937787 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e7908ff-df99-4827-8a79-ce0b7f0f5d80-config\") pod \"etcd-operator-b45778765-97584\" (UID: \"9e7908ff-df99-4827-8a79-ce0b7f0f5d80\") " pod="openshift-etcd-operator/etcd-operator-b45778765-97584" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.937802 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ea531307-d9b1-4798-bac8-d34094d27e2c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mlpft\" (UID: \"ea531307-d9b1-4798-bac8-d34094d27e2c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mlpft" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.937851 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7lvv\" (UniqueName: \"kubernetes.io/projected/c2d03fe9-f23e-4f3d-a281-b80ba7fa7f38-kube-api-access-w7lvv\") pod \"package-server-manager-789f6589d5-n8ttn\" (UID: \"c2d03fe9-f23e-4f3d-a281-b80ba7fa7f38\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n8ttn" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.937869 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phckr\" (UniqueName: \"kubernetes.io/projected/e3945afb-20d9-4312-b9bb-dbe5bc788cda-kube-api-access-phckr\") pod \"service-ca-9c57cc56f-qflxg\" (UID: \"e3945afb-20d9-4312-b9bb-dbe5bc788cda\") " pod="openshift-service-ca/service-ca-9c57cc56f-qflxg" Mar 
21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.938712 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e7908ff-df99-4827-8a79-ce0b7f0f5d80-config\") pod \"etcd-operator-b45778765-97584\" (UID: \"9e7908ff-df99-4827-8a79-ce0b7f0f5d80\") " pod="openshift-etcd-operator/etcd-operator-b45778765-97584" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.939769 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2efbba77-f5bd-48cb-a790-f4c3564acb75-trusted-ca\") pod \"ingress-operator-5b745b69d9-jjffn\" (UID: \"2efbba77-f5bd-48cb-a790-f4c3564acb75\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jjffn" Mar 21 03:49:45 crc kubenswrapper[4685]: E0321 03:49:45.940126 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:46.440110194 +0000 UTC m=+218.917179176 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.937886 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3a049943-f882-47d7-ac4e-703eebca8103-srv-cert\") pod \"catalog-operator-68c6474976-fb258\" (UID: \"3a049943-f882-47d7-ac4e-703eebca8103\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fb258" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.942761 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2663314f-3e35-4ab4-b9b9-28e829cde5de-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2ckzq\" (UID: \"2663314f-3e35-4ab4-b9b9-28e829cde5de\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2ckzq" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.942867 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9e7908ff-df99-4827-8a79-ce0b7f0f5d80-etcd-client\") pod \"etcd-operator-b45778765-97584\" (UID: \"9e7908ff-df99-4827-8a79-ce0b7f0f5d80\") " pod="openshift-etcd-operator/etcd-operator-b45778765-97584" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.942930 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b13d8427-dcef-4925-92b6-0e6bf1aca8c8-service-ca-bundle\") pod \"router-default-5444994796-vds97\" (UID: \"b13d8427-dcef-4925-92b6-0e6bf1aca8c8\") " pod="openshift-ingress/router-default-5444994796-vds97" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.942959 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ce5ae64a-c99b-4700-aef6-23a77794a308-images\") pod 
\"machine-config-operator-74547568cd-2jl6z\" (UID: \"ce5ae64a-c99b-4700-aef6-23a77794a308\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jl6z" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.943016 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49f645cb-7805-4ded-9f3e-d43bdb3801a6-trusted-ca-bundle\") pod \"console-f9d7485db-j8qrk\" (UID: \"49f645cb-7805-4ded-9f3e-d43bdb3801a6\") " pod="openshift-console/console-f9d7485db-j8qrk" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.943046 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2c9z\" (UniqueName: \"kubernetes.io/projected/7b429d7c-3700-4d9f-b6b2-554219223515-kube-api-access-j2c9z\") pod \"console-operator-58897d9998-djlmv\" (UID: \"7b429d7c-3700-4d9f-b6b2-554219223515\") " pod="openshift-console-operator/console-operator-58897d9998-djlmv" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.943102 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm64d\" (UniqueName: \"kubernetes.io/projected/ce5ae64a-c99b-4700-aef6-23a77794a308-kube-api-access-zm64d\") pod \"machine-config-operator-74547568cd-2jl6z\" (UID: \"ce5ae64a-c99b-4700-aef6-23a77794a308\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jl6z" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.944019 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b695978e-b67c-4812-9083-22538cdd3045-mountpoint-dir\") pod \"csi-hostpathplugin-2pgbl\" (UID: \"b695978e-b67c-4812-9083-22538cdd3045\") " pod="hostpath-provisioner/csi-hostpathplugin-2pgbl" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.944062 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd62f2fc-10ef-4a6e-80f1-0813f3a681bd-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-p5r46\" (UID: \"dd62f2fc-10ef-4a6e-80f1-0813f3a681bd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-p5r46" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.944072 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b429d7c-3700-4d9f-b6b2-554219223515-serving-cert\") pod \"console-operator-58897d9998-djlmv\" (UID: \"7b429d7c-3700-4d9f-b6b2-554219223515\") " pod="openshift-console-operator/console-operator-58897d9998-djlmv" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.944087 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b695978e-b67c-4812-9083-22538cdd3045-socket-dir\") pod \"csi-hostpathplugin-2pgbl\" (UID: \"b695978e-b67c-4812-9083-22538cdd3045\") " pod="hostpath-provisioner/csi-hostpathplugin-2pgbl" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.943625 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd62f2fc-10ef-4a6e-80f1-0813f3a681bd-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-p5r46\" (UID: \"dd62f2fc-10ef-4a6e-80f1-0813f3a681bd\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-p5r46" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.944175 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b13d8427-dcef-4925-92b6-0e6bf1aca8c8-service-ca-bundle\") pod \"router-default-5444994796-vds97\" (UID: \"b13d8427-dcef-4925-92b6-0e6bf1aca8c8\") " pod="openshift-ingress/router-default-5444994796-vds97" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.943477 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2663314f-3e35-4ab4-b9b9-28e829cde5de-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2ckzq\" (UID: \"2663314f-3e35-4ab4-b9b9-28e829cde5de\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2ckzq" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.944394 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb0bd79d-5459-4b35-adb8-eca9d0fb069c-service-ca-bundle\") pod \"authentication-operator-69f744f599-cd44l\" (UID: \"bb0bd79d-5459-4b35-adb8-eca9d0fb069c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cd44l" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.944435 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ca5cb48c-43cc-428f-bf99-ba396d595e5c-audit-policies\") pod \"apiserver-7bbb656c7d-tb9dl\" (UID: \"ca5cb48c-43cc-428f-bf99-ba396d595e5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.944471 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmxjl\" (UniqueName: \"kubernetes.io/projected/eb165aaf-36a6-4965-bebf-6a40e1695b94-kube-api-access-qmxjl\") pod \"downloads-7954f5f757-clz2m\" (UID: \"eb165aaf-36a6-4965-bebf-6a40e1695b94\") " pod="openshift-console/downloads-7954f5f757-clz2m" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.944503 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kptqv\" (UniqueName: \"kubernetes.io/projected/119e0da8-6d9c-48cc-ab21-85a78ca95c5c-kube-api-access-kptqv\") pod \"ingress-canary-k7m7b\" (UID: \"119e0da8-6d9c-48cc-ab21-85a78ca95c5c\") " pod="openshift-ingress-canary/ingress-canary-k7m7b" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.944528 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/236033b2-63e5-43e2-a9b7-3549f0802c30-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-dk5lq\" (UID: \"236033b2-63e5-43e2-a9b7-3549f0802c30\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dk5lq" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.944551 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b695978e-b67c-4812-9083-22538cdd3045-plugins-dir\") pod \"csi-hostpathplugin-2pgbl\" (UID: \"b695978e-b67c-4812-9083-22538cdd3045\") " pod="hostpath-provisioner/csi-hostpathplugin-2pgbl" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 
03:49:45.944576 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9e7908ff-df99-4827-8a79-ce0b7f0f5d80-etcd-ca\") pod \"etcd-operator-b45778765-97584\" (UID: \"9e7908ff-df99-4827-8a79-ce0b7f0f5d80\") " pod="openshift-etcd-operator/etcd-operator-b45778765-97584" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.944604 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xtkn\" (UniqueName: \"kubernetes.io/projected/d841e9a0-5181-40b8-9374-daa38341c4ff-kube-api-access-6xtkn\") pod \"dns-operator-744455d44c-52mpk\" (UID: \"d841e9a0-5181-40b8-9374-daa38341c4ff\") " pod="openshift-dns-operator/dns-operator-744455d44c-52mpk" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.944629 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9e7908ff-df99-4827-8a79-ce0b7f0f5d80-etcd-service-ca\") pod \"etcd-operator-b45778765-97584\" (UID: \"9e7908ff-df99-4827-8a79-ce0b7f0f5d80\") " pod="openshift-etcd-operator/etcd-operator-b45778765-97584" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.944655 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfjj7\" (UniqueName: \"kubernetes.io/projected/236033b2-63e5-43e2-a9b7-3549f0802c30-kube-api-access-dfjj7\") pod \"kube-storage-version-migrator-operator-b67b599dd-dk5lq\" (UID: \"236033b2-63e5-43e2-a9b7-3549f0802c30\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dk5lq" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.944681 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a66c652-318d-4fc7-9bff-a73d65b12966-metrics-tls\") pod \"dns-default-hncrq\" (UID: \"6a66c652-318d-4fc7-9bff-a73d65b12966\") " pod="openshift-dns/dns-default-hncrq" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.944709 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ca5cb48c-43cc-428f-bf99-ba396d595e5c-etcd-client\") pod \"apiserver-7bbb656c7d-tb9dl\" (UID: \"ca5cb48c-43cc-428f-bf99-ba396d595e5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.944735 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5a823511-d878-4e6d-acda-4202e00e3aab-registry-certificates\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.944756 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b13d8427-dcef-4925-92b6-0e6bf1aca8c8-metrics-certs\") pod \"router-default-5444994796-vds97\" (UID: \"b13d8427-dcef-4925-92b6-0e6bf1aca8c8\") " pod="openshift-ingress/router-default-5444994796-vds97" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.944778 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a3e73bc-7747-47d5-958b-1adc974f20a9-proxy-tls\") pod 
\"machine-config-controller-84d6567774-l8n7r\" (UID: \"5a3e73bc-7747-47d5-958b-1adc974f20a9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l8n7r" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.944806 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c21a705-7e47-4418-803a-41a459acef90-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6tr2m\" (UID: \"7c21a705-7e47-4418-803a-41a459acef90\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6tr2m" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.944830 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2efbba77-f5bd-48cb-a790-f4c3564acb75-metrics-tls\") pod \"ingress-operator-5b745b69d9-jjffn\" (UID: \"2efbba77-f5bd-48cb-a790-f4c3564acb75\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jjffn" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.944880 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7a27ba4-0f5f-4ad6-9883-8698fd160802-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-m9xxt\" (UID: \"b7a27ba4-0f5f-4ad6-9883-8698fd160802\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m9xxt" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.944904 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxwm4\" (UniqueName: \"kubernetes.io/projected/081c0a2d-7b34-487c-b965-e29e9231ef7b-kube-api-access-qxwm4\") pod \"migrator-59844c95c7-5bwhq\" (UID: \"081c0a2d-7b34-487c-b965-e29e9231ef7b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5bwhq" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.944928 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b7a27ba4-0f5f-4ad6-9883-8698fd160802-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-m9xxt\" (UID: \"b7a27ba4-0f5f-4ad6-9883-8698fd160802\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m9xxt" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.944958 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ca5cb48c-43cc-428f-bf99-ba396d595e5c-encryption-config\") pod \"apiserver-7bbb656c7d-tb9dl\" (UID: \"ca5cb48c-43cc-428f-bf99-ba396d595e5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.944980 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e3945afb-20d9-4312-b9bb-dbe5bc788cda-signing-cabundle\") pod \"service-ca-9c57cc56f-qflxg\" (UID: \"e3945afb-20d9-4312-b9bb-dbe5bc788cda\") " pod="openshift-service-ca/service-ca-9c57cc56f-qflxg" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.945010 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/49f645cb-7805-4ded-9f3e-d43bdb3801a6-console-config\") pod \"console-f9d7485db-j8qrk\" (UID: 
\"49f645cb-7805-4ded-9f3e-d43bdb3801a6\") " pod="openshift-console/console-f9d7485db-j8qrk" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.945033 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b0b74168-914c-4a2e-9122-c55d3bc3bcc2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-pcwhp\" (UID: \"b0b74168-914c-4a2e-9122-c55d3bc3bcc2\") " pod="openshift-marketplace/marketplace-operator-79b997595-pcwhp" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.945060 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/236033b2-63e5-43e2-a9b7-3549f0802c30-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-dk5lq\" (UID: \"236033b2-63e5-43e2-a9b7-3549f0802c30\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dk5lq" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.945087 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ce5ae64a-c99b-4700-aef6-23a77794a308-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2jl6z\" (UID: \"ce5ae64a-c99b-4700-aef6-23a77794a308\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jl6z" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.945114 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqdgc\" (UniqueName: \"kubernetes.io/projected/5a823511-d878-4e6d-acda-4202e00e3aab-kube-api-access-tqdgc\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.945138 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/49f645cb-7805-4ded-9f3e-d43bdb3801a6-console-oauth-config\") pod \"console-f9d7485db-j8qrk\" (UID: \"49f645cb-7805-4ded-9f3e-d43bdb3801a6\") " pod="openshift-console/console-f9d7485db-j8qrk" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.945160 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca5cb48c-43cc-428f-bf99-ba396d595e5c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-tb9dl\" (UID: \"ca5cb48c-43cc-428f-bf99-ba396d595e5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.945190 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k542\" (UniqueName: \"kubernetes.io/projected/bb0bd79d-5459-4b35-adb8-eca9d0fb069c-kube-api-access-2k542\") pod \"authentication-operator-69f744f599-cd44l\" (UID: \"bb0bd79d-5459-4b35-adb8-eca9d0fb069c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cd44l" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.945198 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb0bd79d-5459-4b35-adb8-eca9d0fb069c-service-ca-bundle\") pod \"authentication-operator-69f744f599-cd44l\" (UID: \"bb0bd79d-5459-4b35-adb8-eca9d0fb069c\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-cd44l" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.945217 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b0b74168-914c-4a2e-9122-c55d3bc3bcc2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-pcwhp\" (UID: \"b0b74168-914c-4a2e-9122-c55d3bc3bcc2\") " pod="openshift-marketplace/marketplace-operator-79b997595-pcwhp" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.945246 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksnq6\" (UniqueName: \"kubernetes.io/projected/b13d8427-dcef-4925-92b6-0e6bf1aca8c8-kube-api-access-ksnq6\") pod \"router-default-5444994796-vds97\" (UID: \"b13d8427-dcef-4925-92b6-0e6bf1aca8c8\") " pod="openshift-ingress/router-default-5444994796-vds97" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.945287 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rvf2\" (UniqueName: \"kubernetes.io/projected/b74d4447-5cdf-482d-bc97-81cc3f6f5f1c-kube-api-access-9rvf2\") pod \"machine-config-server-qsvzc\" (UID: \"b74d4447-5cdf-482d-bc97-81cc3f6f5f1c\") " pod="openshift-machine-config-operator/machine-config-server-qsvzc" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.945319 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af1a70ec-af98-44c7-a59c-2d5a6d5c1200-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tg9t2\" (UID: \"af1a70ec-af98-44c7-a59c-2d5a6d5c1200\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tg9t2" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.945324 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e7908ff-df99-4827-8a79-ce0b7f0f5d80-serving-cert\") pod \"etcd-operator-b45778765-97584\" (UID: \"9e7908ff-df99-4827-8a79-ce0b7f0f5d80\") " pod="openshift-etcd-operator/etcd-operator-b45778765-97584" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.945343 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0306bf0e-f0c1-4e47-b63e-909b979c5844-secret-volume\") pod \"collect-profiles-29567745-czvxj\" (UID: \"0306bf0e-f0c1-4e47-b63e-909b979c5844\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567745-czvxj" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.945353 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5a823511-d878-4e6d-acda-4202e00e3aab-registry-tls\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.945365 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3a049943-f882-47d7-ac4e-703eebca8103-profile-collector-cert\") pod \"catalog-operator-68c6474976-fb258\" (UID: \"3a049943-f882-47d7-ac4e-703eebca8103\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fb258" 
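[Editor's note on the surrounding entries: the reconciler lines trace kubelet's volume reconcile loop, with each volume moving through "VerifyControllerAttachedVolume started", then "MountVolume started", then "MountVolume.SetUp succeeded". The one failure in this window is the CSI-backed PVC pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 for image-registry-697d97f7c8-pjvx8: kubelet cannot build a CSI client because kubevirt.io.hostpath-provisioner is "not found in the list of registered CSI drivers". That is consistent with csi-hostpathplugin-2pgbl itself still having its registration-dir, socket-dir, plugins-dir, mountpoint-dir, and csi-data-dir host paths mounted in these same entries, i.e. the plugin has not yet registered over the kubelet plugin-registration socket. A minimal Go sketch of one way to check registration from outside the node follows; it is not kubelet source, it assumes client-go is available and KUBECONFIG points at this cluster, and it uses the node name "crc" seen in this journal. Once a CSI plugin registers, kubelet publishes the driver in the node's CSINode object, so an empty Drivers list here matches the error above.

    // Sketch only: list the CSI drivers kubelet has registered for node "crc".
    package main

    import (
        "context"
        "fmt"
        "log"
        "os"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        // KUBECONFIG path is an assumption for this sketch.
        cfg, err := clientcmd.BuildConfigFromFlags("", os.Getenv("KUBECONFIG"))
        if err != nil {
            log.Fatal(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            log.Fatal(err)
        }
        // The CSINode object mirrors the kubelet's registered-driver list.
        csiNode, err := cs.StorageV1().CSINodes().Get(context.TODO(), "crc", metav1.GetOptions{})
        if err != nil {
            log.Fatal(err)
        }
        for _, d := range csiNode.Spec.Drivers {
            fmt.Println("registered CSI driver:", d.Name)
        }
    }

Absence of kubevirt.io.hostpath-provisioner from that list while the hostpath plugin pod is still mounting its volumes would explain every retry in this log window.]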
Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.945217 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ca5cb48c-43cc-428f-bf99-ba396d595e5c-audit-policies\") pod \"apiserver-7bbb656c7d-tb9dl\" (UID: \"ca5cb48c-43cc-428f-bf99-ba396d595e5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.945520 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/3f819915-64a0-4327-ac01-5ff842cbc592-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-646sv\" (UID: \"3f819915-64a0-4327-ac01-5ff842cbc592\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-646sv" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.945554 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s67x\" (UniqueName: \"kubernetes.io/projected/9e7908ff-df99-4827-8a79-ce0b7f0f5d80-kube-api-access-2s67x\") pod \"etcd-operator-b45778765-97584\" (UID: \"9e7908ff-df99-4827-8a79-ce0b7f0f5d80\") " pod="openshift-etcd-operator/etcd-operator-b45778765-97584" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.945584 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzbh5\" (UniqueName: \"kubernetes.io/projected/4cc9f216-4352-4bff-a0eb-48659e3a603d-kube-api-access-tzbh5\") pod \"service-ca-operator-777779d784-vktqs\" (UID: \"4cc9f216-4352-4bff-a0eb-48659e3a603d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vktqs" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.945610 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv8pp\" (UniqueName: \"kubernetes.io/projected/0306bf0e-f0c1-4e47-b63e-909b979c5844-kube-api-access-mv8pp\") pod \"collect-profiles-29567745-czvxj\" (UID: \"0306bf0e-f0c1-4e47-b63e-909b979c5844\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567745-czvxj" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.945646 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/49f645cb-7805-4ded-9f3e-d43bdb3801a6-oauth-serving-cert\") pod \"console-f9d7485db-j8qrk\" (UID: \"49f645cb-7805-4ded-9f3e-d43bdb3801a6\") " pod="openshift-console/console-f9d7485db-j8qrk" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.945672 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c21a705-7e47-4418-803a-41a459acef90-config\") pod \"kube-apiserver-operator-766d6c64bb-6tr2m\" (UID: \"7c21a705-7e47-4418-803a-41a459acef90\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6tr2m" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.945871 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/8f518bb5-0336-44df-ac60-174c8426974e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2ptjk\" (UID: \"8f518bb5-0336-44df-ac60-174c8426974e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2ptjk" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 
03:49:45.945898 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2663314f-3e35-4ab4-b9b9-28e829cde5de-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2ckzq\" (UID: \"2663314f-3e35-4ab4-b9b9-28e829cde5de\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2ckzq" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.945920 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5a823511-d878-4e6d-acda-4202e00e3aab-trusted-ca\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.945943 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af1a70ec-af98-44c7-a59c-2d5a6d5c1200-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tg9t2\" (UID: \"af1a70ec-af98-44c7-a59c-2d5a6d5c1200\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tg9t2" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.945966 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb0bd79d-5459-4b35-adb8-eca9d0fb069c-config\") pod \"authentication-operator-69f744f599-cd44l\" (UID: \"bb0bd79d-5459-4b35-adb8-eca9d0fb069c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cd44l" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.945990 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzkd6\" (UniqueName: \"kubernetes.io/projected/ea531307-d9b1-4798-bac8-d34094d27e2c-kube-api-access-bzkd6\") pod \"olm-operator-6b444d44fb-mlpft\" (UID: \"ea531307-d9b1-4798-bac8-d34094d27e2c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mlpft" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.946011 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af1a70ec-af98-44c7-a59c-2d5a6d5c1200-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tg9t2\" (UID: \"af1a70ec-af98-44c7-a59c-2d5a6d5c1200\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tg9t2" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.946034 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d841e9a0-5181-40b8-9374-daa38341c4ff-metrics-tls\") pod \"dns-operator-744455d44c-52mpk\" (UID: \"d841e9a0-5181-40b8-9374-daa38341c4ff\") " pod="openshift-dns-operator/dns-operator-744455d44c-52mpk" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.946055 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ca5cb48c-43cc-428f-bf99-ba396d595e5c-audit-dir\") pod \"apiserver-7bbb656c7d-tb9dl\" (UID: \"ca5cb48c-43cc-428f-bf99-ba396d595e5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.946077 4685 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f17fa271-26a6-4620-930b-f30d50f3412b-webhook-cert\") pod \"packageserver-d55dfcdfc-skpjj\" (UID: \"f17fa271-26a6-4620-930b-f30d50f3412b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-skpjj" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.946104 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2d03fe9-f23e-4f3d-a281-b80ba7fa7f38-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-n8ttn\" (UID: \"c2d03fe9-f23e-4f3d-a281-b80ba7fa7f38\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n8ttn" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.946128 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b74d4447-5cdf-482d-bc97-81cc3f6f5f1c-node-bootstrap-token\") pod \"machine-config-server-qsvzc\" (UID: \"b74d4447-5cdf-482d-bc97-81cc3f6f5f1c\") " pod="openshift-machine-config-operator/machine-config-server-qsvzc" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.946164 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkt2m\" (UniqueName: \"kubernetes.io/projected/b0b74168-914c-4a2e-9122-c55d3bc3bcc2-kube-api-access-hkt2m\") pod \"marketplace-operator-79b997595-pcwhp\" (UID: \"b0b74168-914c-4a2e-9122-c55d3bc3bcc2\") " pod="openshift-marketplace/marketplace-operator-79b997595-pcwhp" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.946184 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b695978e-b67c-4812-9083-22538cdd3045-registration-dir\") pod \"csi-hostpathplugin-2pgbl\" (UID: \"b695978e-b67c-4812-9083-22538cdd3045\") " pod="hostpath-provisioner/csi-hostpathplugin-2pgbl" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.946209 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvp2w\" (UniqueName: \"kubernetes.io/projected/8f518bb5-0336-44df-ac60-174c8426974e-kube-api-access-xvp2w\") pod \"openshift-config-operator-7777fb866f-2ptjk\" (UID: \"8f518bb5-0336-44df-ac60-174c8426974e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2ptjk" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.946231 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k45qr\" (UniqueName: \"kubernetes.io/projected/6a66c652-318d-4fc7-9bff-a73d65b12966-kube-api-access-k45qr\") pod \"dns-default-hncrq\" (UID: \"6a66c652-318d-4fc7-9bff-a73d65b12966\") " pod="openshift-dns/dns-default-hncrq" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.946254 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9e7908ff-df99-4827-8a79-ce0b7f0f5d80-etcd-ca\") pod \"etcd-operator-b45778765-97584\" (UID: \"9e7908ff-df99-4827-8a79-ce0b7f0f5d80\") " pod="openshift-etcd-operator/etcd-operator-b45778765-97584" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.946369 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" 
(UniqueName: \"kubernetes.io/secret/5a823511-d878-4e6d-acda-4202e00e3aab-installation-pull-secrets\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.946391 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5a823511-d878-4e6d-acda-4202e00e3aab-ca-trust-extracted\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.946413 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frlpd\" (UniqueName: \"kubernetes.io/projected/844b8e4f-7ba3-4a52-b6a7-0e8ea0d78003-kube-api-access-frlpd\") pod \"cluster-samples-operator-665b6dd947-2tvkz\" (UID: \"844b8e4f-7ba3-4a52-b6a7-0e8ea0d78003\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2tvkz" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.946434 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgn8b\" (UniqueName: \"kubernetes.io/projected/ca5cb48c-43cc-428f-bf99-ba396d595e5c-kube-api-access-rgn8b\") pod \"apiserver-7bbb656c7d-tb9dl\" (UID: \"ca5cb48c-43cc-428f-bf99-ba396d595e5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.946458 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g665\" (UniqueName: \"kubernetes.io/projected/3f819915-64a0-4327-ac01-5ff842cbc592-kube-api-access-6g665\") pod \"control-plane-machine-set-operator-78cbb6b69f-646sv\" (UID: \"3f819915-64a0-4327-ac01-5ff842cbc592\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-646sv" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.946484 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb0bd79d-5459-4b35-adb8-eca9d0fb069c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-cd44l\" (UID: \"bb0bd79d-5459-4b35-adb8-eca9d0fb069c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cd44l" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.946506 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a66c652-318d-4fc7-9bff-a73d65b12966-config-volume\") pod \"dns-default-hncrq\" (UID: \"6a66c652-318d-4fc7-9bff-a73d65b12966\") " pod="openshift-dns/dns-default-hncrq" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.946530 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czl5l\" (UniqueName: \"kubernetes.io/projected/49f645cb-7805-4ded-9f3e-d43bdb3801a6-kube-api-access-czl5l\") pod \"console-f9d7485db-j8qrk\" (UID: \"49f645cb-7805-4ded-9f3e-d43bdb3801a6\") " pod="openshift-console/console-f9d7485db-j8qrk" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.946552 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce5ae64a-c99b-4700-aef6-23a77794a308-proxy-tls\") pod 
\"machine-config-operator-74547568cd-2jl6z\" (UID: \"ce5ae64a-c99b-4700-aef6-23a77794a308\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jl6z" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.946575 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg48m\" (UniqueName: \"kubernetes.io/projected/5a3e73bc-7747-47d5-958b-1adc974f20a9-kube-api-access-kg48m\") pod \"machine-config-controller-84d6567774-l8n7r\" (UID: \"5a3e73bc-7747-47d5-958b-1adc974f20a9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l8n7r" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.946601 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2efbba77-f5bd-48cb-a790-f4c3564acb75-bound-sa-token\") pod \"ingress-operator-5b745b69d9-jjffn\" (UID: \"2efbba77-f5bd-48cb-a790-f4c3564acb75\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jjffn" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.946623 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/119e0da8-6d9c-48cc-ab21-85a78ca95c5c-cert\") pod \"ingress-canary-k7m7b\" (UID: \"119e0da8-6d9c-48cc-ab21-85a78ca95c5c\") " pod="openshift-ingress-canary/ingress-canary-k7m7b" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.946646 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd62f2fc-10ef-4a6e-80f1-0813f3a681bd-config\") pod \"kube-controller-manager-operator-78b949d7b-p5r46\" (UID: \"dd62f2fc-10ef-4a6e-80f1-0813f3a681bd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-p5r46" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.946671 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ca5cb48c-43cc-428f-bf99-ba396d595e5c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-tb9dl\" (UID: \"ca5cb48c-43cc-428f-bf99-ba396d595e5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.946693 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b429d7c-3700-4d9f-b6b2-554219223515-trusted-ca\") pod \"console-operator-58897d9998-djlmv\" (UID: \"7b429d7c-3700-4d9f-b6b2-554219223515\") " pod="openshift-console-operator/console-operator-58897d9998-djlmv" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.946714 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b695978e-b67c-4812-9083-22538cdd3045-csi-data-dir\") pod \"csi-hostpathplugin-2pgbl\" (UID: \"b695978e-b67c-4812-9083-22538cdd3045\") " pod="hostpath-provisioner/csi-hostpathplugin-2pgbl" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.946736 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b74d4447-5cdf-482d-bc97-81cc3f6f5f1c-certs\") pod \"machine-config-server-qsvzc\" (UID: \"b74d4447-5cdf-482d-bc97-81cc3f6f5f1c\") " pod="openshift-machine-config-operator/machine-config-server-qsvzc" Mar 21 03:49:45 crc 
kubenswrapper[4685]: I0321 03:49:45.946921 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjpgn\" (UniqueName: \"kubernetes.io/projected/3a049943-f882-47d7-ac4e-703eebca8103-kube-api-access-mjpgn\") pod \"catalog-operator-68c6474976-fb258\" (UID: \"3a049943-f882-47d7-ac4e-703eebca8103\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fb258" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.946952 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f518bb5-0336-44df-ac60-174c8426974e-serving-cert\") pod \"openshift-config-operator-7777fb866f-2ptjk\" (UID: \"8f518bb5-0336-44df-ac60-174c8426974e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2ptjk" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.946986 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b13d8427-dcef-4925-92b6-0e6bf1aca8c8-default-certificate\") pod \"router-default-5444994796-vds97\" (UID: \"b13d8427-dcef-4925-92b6-0e6bf1aca8c8\") " pod="openshift-ingress/router-default-5444994796-vds97" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.947011 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca5cb48c-43cc-428f-bf99-ba396d595e5c-serving-cert\") pod \"apiserver-7bbb656c7d-tb9dl\" (UID: \"ca5cb48c-43cc-428f-bf99-ba396d595e5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.947034 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5a3e73bc-7747-47d5-958b-1adc974f20a9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-l8n7r\" (UID: \"5a3e73bc-7747-47d5-958b-1adc974f20a9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l8n7r" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.947057 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/49f645cb-7805-4ded-9f3e-d43bdb3801a6-console-serving-cert\") pod \"console-f9d7485db-j8qrk\" (UID: \"49f645cb-7805-4ded-9f3e-d43bdb3801a6\") " pod="openshift-console/console-f9d7485db-j8qrk" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.947076 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/49f645cb-7805-4ded-9f3e-d43bdb3801a6-service-ca\") pod \"console-f9d7485db-j8qrk\" (UID: \"49f645cb-7805-4ded-9f3e-d43bdb3801a6\") " pod="openshift-console/console-f9d7485db-j8qrk" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.947098 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b429d7c-3700-4d9f-b6b2-554219223515-config\") pod \"console-operator-58897d9998-djlmv\" (UID: \"7b429d7c-3700-4d9f-b6b2-554219223515\") " pod="openshift-console-operator/console-operator-58897d9998-djlmv" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.947120 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/f17fa271-26a6-4620-930b-f30d50f3412b-apiservice-cert\") pod \"packageserver-d55dfcdfc-skpjj\" (UID: \"f17fa271-26a6-4620-930b-f30d50f3412b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-skpjj" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.947144 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhvl9\" (UniqueName: \"kubernetes.io/projected/38f42eb2-0cf6-4c5b-8159-a89e22404a73-kube-api-access-zhvl9\") pod \"multus-admission-controller-857f4d67dd-n9mn4\" (UID: \"38f42eb2-0cf6-4c5b-8159-a89e22404a73\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-n9mn4" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.947220 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/8f518bb5-0336-44df-ac60-174c8426974e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2ptjk\" (UID: \"8f518bb5-0336-44df-ac60-174c8426974e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2ptjk" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.948273 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49f645cb-7805-4ded-9f3e-d43bdb3801a6-trusted-ca-bundle\") pod \"console-f9d7485db-j8qrk\" (UID: \"49f645cb-7805-4ded-9f3e-d43bdb3801a6\") " pod="openshift-console/console-f9d7485db-j8qrk" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.948791 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ca5cb48c-43cc-428f-bf99-ba396d595e5c-encryption-config\") pod \"apiserver-7bbb656c7d-tb9dl\" (UID: \"ca5cb48c-43cc-428f-bf99-ba396d595e5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.948946 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/844b8e4f-7ba3-4a52-b6a7-0e8ea0d78003-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-2tvkz\" (UID: \"844b8e4f-7ba3-4a52-b6a7-0e8ea0d78003\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2tvkz" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.949015 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b13d8427-dcef-4925-92b6-0e6bf1aca8c8-stats-auth\") pod \"router-default-5444994796-vds97\" (UID: \"b13d8427-dcef-4925-92b6-0e6bf1aca8c8\") " pod="openshift-ingress/router-default-5444994796-vds97" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.949176 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb0bd79d-5459-4b35-adb8-eca9d0fb069c-config\") pod \"authentication-operator-69f744f599-cd44l\" (UID: \"bb0bd79d-5459-4b35-adb8-eca9d0fb069c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cd44l" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.949473 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9e7908ff-df99-4827-8a79-ce0b7f0f5d80-etcd-service-ca\") pod \"etcd-operator-b45778765-97584\" (UID: \"9e7908ff-df99-4827-8a79-ce0b7f0f5d80\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-97584" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.949536 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5a823511-d878-4e6d-acda-4202e00e3aab-trusted-ca\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.949738 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/49f645cb-7805-4ded-9f3e-d43bdb3801a6-oauth-serving-cert\") pod \"console-f9d7485db-j8qrk\" (UID: \"49f645cb-7805-4ded-9f3e-d43bdb3801a6\") " pod="openshift-console/console-f9d7485db-j8qrk" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.949770 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ca5cb48c-43cc-428f-bf99-ba396d595e5c-audit-dir\") pod \"apiserver-7bbb656c7d-tb9dl\" (UID: \"ca5cb48c-43cc-428f-bf99-ba396d595e5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.949781 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/49f645cb-7805-4ded-9f3e-d43bdb3801a6-console-config\") pod \"console-f9d7485db-j8qrk\" (UID: \"49f645cb-7805-4ded-9f3e-d43bdb3801a6\") " pod="openshift-console/console-f9d7485db-j8qrk" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.950152 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd62f2fc-10ef-4a6e-80f1-0813f3a681bd-config\") pod \"kube-controller-manager-operator-78b949d7b-p5r46\" (UID: \"dd62f2fc-10ef-4a6e-80f1-0813f3a681bd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-p5r46" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.950180 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca5cb48c-43cc-428f-bf99-ba396d595e5c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-tb9dl\" (UID: \"ca5cb48c-43cc-428f-bf99-ba396d595e5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.950672 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/49f645cb-7805-4ded-9f3e-d43bdb3801a6-console-oauth-config\") pod \"console-f9d7485db-j8qrk\" (UID: \"49f645cb-7805-4ded-9f3e-d43bdb3801a6\") " pod="openshift-console/console-f9d7485db-j8qrk" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.950693 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5a823511-d878-4e6d-acda-4202e00e3aab-registry-certificates\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.951348 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ca5cb48c-43cc-428f-bf99-ba396d595e5c-etcd-client\") pod \"apiserver-7bbb656c7d-tb9dl\" (UID: 
\"ca5cb48c-43cc-428f-bf99-ba396d595e5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.951999 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2663314f-3e35-4ab4-b9b9-28e829cde5de-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2ckzq\" (UID: \"2663314f-3e35-4ab4-b9b9-28e829cde5de\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2ckzq" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.952608 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b429d7c-3700-4d9f-b6b2-554219223515-config\") pod \"console-operator-58897d9998-djlmv\" (UID: \"7b429d7c-3700-4d9f-b6b2-554219223515\") " pod="openshift-console-operator/console-operator-58897d9998-djlmv" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.953005 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d841e9a0-5181-40b8-9374-daa38341c4ff-metrics-tls\") pod \"dns-operator-744455d44c-52mpk\" (UID: \"d841e9a0-5181-40b8-9374-daa38341c4ff\") " pod="openshift-dns-operator/dns-operator-744455d44c-52mpk" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.953361 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ca5cb48c-43cc-428f-bf99-ba396d595e5c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-tb9dl\" (UID: \"ca5cb48c-43cc-428f-bf99-ba396d595e5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.953381 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2efbba77-f5bd-48cb-a790-f4c3564acb75-metrics-tls\") pod \"ingress-operator-5b745b69d9-jjffn\" (UID: \"2efbba77-f5bd-48cb-a790-f4c3564acb75\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jjffn" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.954358 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5a823511-d878-4e6d-acda-4202e00e3aab-ca-trust-extracted\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.954744 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5a823511-d878-4e6d-acda-4202e00e3aab-installation-pull-secrets\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.955003 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c21a705-7e47-4418-803a-41a459acef90-config\") pod \"kube-apiserver-operator-766d6c64bb-6tr2m\" (UID: \"7c21a705-7e47-4418-803a-41a459acef90\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6tr2m" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.955022 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ca5cb48c-43cc-428f-bf99-ba396d595e5c-serving-cert\") pod \"apiserver-7bbb656c7d-tb9dl\" (UID: \"ca5cb48c-43cc-428f-bf99-ba396d595e5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.955159 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/49f645cb-7805-4ded-9f3e-d43bdb3801a6-service-ca\") pod \"console-f9d7485db-j8qrk\" (UID: \"49f645cb-7805-4ded-9f3e-d43bdb3801a6\") " pod="openshift-console/console-f9d7485db-j8qrk" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.955822 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c21a705-7e47-4418-803a-41a459acef90-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6tr2m\" (UID: \"7c21a705-7e47-4418-803a-41a459acef90\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6tr2m" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.955883 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb0bd79d-5459-4b35-adb8-eca9d0fb069c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-cd44l\" (UID: \"bb0bd79d-5459-4b35-adb8-eca9d0fb069c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cd44l" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.956353 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb0bd79d-5459-4b35-adb8-eca9d0fb069c-serving-cert\") pod \"authentication-operator-69f744f599-cd44l\" (UID: \"bb0bd79d-5459-4b35-adb8-eca9d0fb069c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cd44l" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.956397 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9e7908ff-df99-4827-8a79-ce0b7f0f5d80-etcd-client\") pod \"etcd-operator-b45778765-97584\" (UID: \"9e7908ff-df99-4827-8a79-ce0b7f0f5d80\") " pod="openshift-etcd-operator/etcd-operator-b45778765-97584" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.957019 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b429d7c-3700-4d9f-b6b2-554219223515-trusted-ca\") pod \"console-operator-58897d9998-djlmv\" (UID: \"7b429d7c-3700-4d9f-b6b2-554219223515\") " pod="openshift-console-operator/console-operator-58897d9998-djlmv" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.957156 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/49f645cb-7805-4ded-9f3e-d43bdb3801a6-console-serving-cert\") pod \"console-f9d7485db-j8qrk\" (UID: \"49f645cb-7805-4ded-9f3e-d43bdb3801a6\") " pod="openshift-console/console-f9d7485db-j8qrk" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.957229 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f518bb5-0336-44df-ac60-174c8426974e-serving-cert\") pod \"openshift-config-operator-7777fb866f-2ptjk\" (UID: \"8f518bb5-0336-44df-ac60-174c8426974e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2ptjk" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.958259 4685 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b13d8427-dcef-4925-92b6-0e6bf1aca8c8-default-certificate\") pod \"router-default-5444994796-vds97\" (UID: \"b13d8427-dcef-4925-92b6-0e6bf1aca8c8\") " pod="openshift-ingress/router-default-5444994796-vds97" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.960443 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b13d8427-dcef-4925-92b6-0e6bf1aca8c8-metrics-certs\") pod \"router-default-5444994796-vds97\" (UID: \"b13d8427-dcef-4925-92b6-0e6bf1aca8c8\") " pod="openshift-ingress/router-default-5444994796-vds97" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.972091 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kv44\" (UniqueName: \"kubernetes.io/projected/2efbba77-f5bd-48cb-a790-f4c3564acb75-kube-api-access-9kv44\") pod \"ingress-operator-5b745b69d9-jjffn\" (UID: \"2efbba77-f5bd-48cb-a790-f4c3564acb75\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jjffn" Mar 21 03:49:45 crc kubenswrapper[4685]: I0321 03:49:45.993157 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtxr4\" (UniqueName: \"kubernetes.io/projected/2663314f-3e35-4ab4-b9b9-28e829cde5de-kube-api-access-dtxr4\") pod \"openshift-controller-manager-operator-756b6f6bc6-2ckzq\" (UID: \"2663314f-3e35-4ab4-b9b9-28e829cde5de\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2ckzq" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.014874 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7c21a705-7e47-4418-803a-41a459acef90-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6tr2m\" (UID: \"7c21a705-7e47-4418-803a-41a459acef90\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6tr2m" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.037621 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5a823511-d878-4e6d-acda-4202e00e3aab-bound-sa-token\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.047989 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:46 crc kubenswrapper[4685]: E0321 03:49:46.048182 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:46.548130408 +0000 UTC m=+219.025199200 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.048232 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5a3e73bc-7747-47d5-958b-1adc974f20a9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-l8n7r\" (UID: \"5a3e73bc-7747-47d5-958b-1adc974f20a9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l8n7r" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.048269 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f17fa271-26a6-4620-930b-f30d50f3412b-apiservice-cert\") pod \"packageserver-d55dfcdfc-skpjj\" (UID: \"f17fa271-26a6-4620-930b-f30d50f3412b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-skpjj" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.048287 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhvl9\" (UniqueName: \"kubernetes.io/projected/38f42eb2-0cf6-4c5b-8159-a89e22404a73-kube-api-access-zhvl9\") pod \"multus-admission-controller-857f4d67dd-n9mn4\" (UID: \"38f42eb2-0cf6-4c5b-8159-a89e22404a73\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-n9mn4" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.048312 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/38f42eb2-0cf6-4c5b-8159-a89e22404a73-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-n9mn4\" (UID: \"38f42eb2-0cf6-4c5b-8159-a89e22404a73\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-n9mn4" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.048331 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmb7h\" (UniqueName: \"kubernetes.io/projected/b7a27ba4-0f5f-4ad6-9883-8698fd160802-kube-api-access-gmb7h\") pod \"cluster-image-registry-operator-dc59b4c8b-m9xxt\" (UID: \"b7a27ba4-0f5f-4ad6-9883-8698fd160802\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m9xxt" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.048374 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9ffg\" (UniqueName: \"kubernetes.io/projected/f17fa271-26a6-4620-930b-f30d50f3412b-kube-api-access-q9ffg\") pod \"packageserver-d55dfcdfc-skpjj\" (UID: \"f17fa271-26a6-4620-930b-f30d50f3412b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-skpjj" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.048399 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4cc9f216-4352-4bff-a0eb-48659e3a603d-serving-cert\") pod \"service-ca-operator-777779d784-vktqs\" (UID: \"4cc9f216-4352-4bff-a0eb-48659e3a603d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vktqs" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.048422 
4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fr6x\" (UniqueName: \"kubernetes.io/projected/b695978e-b67c-4812-9083-22538cdd3045-kube-api-access-2fr6x\") pod \"csi-hostpathplugin-2pgbl\" (UID: \"b695978e-b67c-4812-9083-22538cdd3045\") " pod="hostpath-provisioner/csi-hostpathplugin-2pgbl" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.049374 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e3945afb-20d9-4312-b9bb-dbe5bc788cda-signing-key\") pod \"service-ca-9c57cc56f-qflxg\" (UID: \"e3945afb-20d9-4312-b9bb-dbe5bc788cda\") " pod="openshift-service-ca/service-ca-9c57cc56f-qflxg" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.049400 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.049417 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f17fa271-26a6-4620-930b-f30d50f3412b-tmpfs\") pod \"packageserver-d55dfcdfc-skpjj\" (UID: \"f17fa271-26a6-4620-930b-f30d50f3412b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-skpjj" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.049432 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0306bf0e-f0c1-4e47-b63e-909b979c5844-config-volume\") pod \"collect-profiles-29567745-czvxj\" (UID: \"0306bf0e-f0c1-4e47-b63e-909b979c5844\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567745-czvxj" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.049449 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7a27ba4-0f5f-4ad6-9883-8698fd160802-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-m9xxt\" (UID: \"b7a27ba4-0f5f-4ad6-9883-8698fd160802\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m9xxt" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.049464 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ea531307-d9b1-4798-bac8-d34094d27e2c-srv-cert\") pod \"olm-operator-6b444d44fb-mlpft\" (UID: \"ea531307-d9b1-4798-bac8-d34094d27e2c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mlpft" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.049796 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5a3e73bc-7747-47d5-958b-1adc974f20a9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-l8n7r\" (UID: \"5a3e73bc-7747-47d5-958b-1adc974f20a9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l8n7r" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.050048 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f17fa271-26a6-4620-930b-f30d50f3412b-tmpfs\") pod \"packageserver-d55dfcdfc-skpjj\" 
(UID: \"f17fa271-26a6-4620-930b-f30d50f3412b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-skpjj" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.050448 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0306bf0e-f0c1-4e47-b63e-909b979c5844-config-volume\") pod \"collect-profiles-29567745-czvxj\" (UID: \"0306bf0e-f0c1-4e47-b63e-909b979c5844\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567745-czvxj" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.050705 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgshs\" (UniqueName: \"kubernetes.io/projected/4ede3f08-f29b-4cb9-a96f-1c66239498f6-kube-api-access-kgshs\") pod \"auto-csr-approver-29567748-zv7h8\" (UID: \"4ede3f08-f29b-4cb9-a96f-1c66239498f6\") " pod="openshift-infra/auto-csr-approver-29567748-zv7h8" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.050749 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cc9f216-4352-4bff-a0eb-48659e3a603d-config\") pod \"service-ca-operator-777779d784-vktqs\" (UID: \"4cc9f216-4352-4bff-a0eb-48659e3a603d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vktqs" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.050785 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ea531307-d9b1-4798-bac8-d34094d27e2c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mlpft\" (UID: \"ea531307-d9b1-4798-bac8-d34094d27e2c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mlpft" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.050824 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7lvv\" (UniqueName: \"kubernetes.io/projected/c2d03fe9-f23e-4f3d-a281-b80ba7fa7f38-kube-api-access-w7lvv\") pod \"package-server-manager-789f6589d5-n8ttn\" (UID: \"c2d03fe9-f23e-4f3d-a281-b80ba7fa7f38\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n8ttn" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.050881 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phckr\" (UniqueName: \"kubernetes.io/projected/e3945afb-20d9-4312-b9bb-dbe5bc788cda-kube-api-access-phckr\") pod \"service-ca-9c57cc56f-qflxg\" (UID: \"e3945afb-20d9-4312-b9bb-dbe5bc788cda\") " pod="openshift-service-ca/service-ca-9c57cc56f-qflxg" Mar 21 03:49:46 crc kubenswrapper[4685]: E0321 03:49:46.050901 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:46.550890501 +0000 UTC m=+219.027959293 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.050923 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3a049943-f882-47d7-ac4e-703eebca8103-srv-cert\") pod \"catalog-operator-68c6474976-fb258\" (UID: \"3a049943-f882-47d7-ac4e-703eebca8103\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fb258" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.050983 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ce5ae64a-c99b-4700-aef6-23a77794a308-images\") pod \"machine-config-operator-74547568cd-2jl6z\" (UID: \"ce5ae64a-c99b-4700-aef6-23a77794a308\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jl6z" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.051154 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm64d\" (UniqueName: \"kubernetes.io/projected/ce5ae64a-c99b-4700-aef6-23a77794a308-kube-api-access-zm64d\") pod \"machine-config-operator-74547568cd-2jl6z\" (UID: \"ce5ae64a-c99b-4700-aef6-23a77794a308\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jl6z" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.051174 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b695978e-b67c-4812-9083-22538cdd3045-mountpoint-dir\") pod \"csi-hostpathplugin-2pgbl\" (UID: \"b695978e-b67c-4812-9083-22538cdd3045\") " pod="hostpath-provisioner/csi-hostpathplugin-2pgbl" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.051196 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b695978e-b67c-4812-9083-22538cdd3045-socket-dir\") pod \"csi-hostpathplugin-2pgbl\" (UID: \"b695978e-b67c-4812-9083-22538cdd3045\") " pod="hostpath-provisioner/csi-hostpathplugin-2pgbl" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.051230 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kptqv\" (UniqueName: \"kubernetes.io/projected/119e0da8-6d9c-48cc-ab21-85a78ca95c5c-kube-api-access-kptqv\") pod \"ingress-canary-k7m7b\" (UID: \"119e0da8-6d9c-48cc-ab21-85a78ca95c5c\") " pod="openshift-ingress-canary/ingress-canary-k7m7b" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.051248 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/236033b2-63e5-43e2-a9b7-3549f0802c30-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-dk5lq\" (UID: \"236033b2-63e5-43e2-a9b7-3549f0802c30\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dk5lq" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.051263 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: 
\"kubernetes.io/host-path/b695978e-b67c-4812-9083-22538cdd3045-plugins-dir\") pod \"csi-hostpathplugin-2pgbl\" (UID: \"b695978e-b67c-4812-9083-22538cdd3045\") " pod="hostpath-provisioner/csi-hostpathplugin-2pgbl" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.051291 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfjj7\" (UniqueName: \"kubernetes.io/projected/236033b2-63e5-43e2-a9b7-3549f0802c30-kube-api-access-dfjj7\") pod \"kube-storage-version-migrator-operator-b67b599dd-dk5lq\" (UID: \"236033b2-63e5-43e2-a9b7-3549f0802c30\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dk5lq" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.051308 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a66c652-318d-4fc7-9bff-a73d65b12966-metrics-tls\") pod \"dns-default-hncrq\" (UID: \"6a66c652-318d-4fc7-9bff-a73d65b12966\") " pod="openshift-dns/dns-default-hncrq" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.051329 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a3e73bc-7747-47d5-958b-1adc974f20a9-proxy-tls\") pod \"machine-config-controller-84d6567774-l8n7r\" (UID: \"5a3e73bc-7747-47d5-958b-1adc974f20a9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l8n7r" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.051353 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7a27ba4-0f5f-4ad6-9883-8698fd160802-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-m9xxt\" (UID: \"b7a27ba4-0f5f-4ad6-9883-8698fd160802\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m9xxt" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.051369 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxwm4\" (UniqueName: \"kubernetes.io/projected/081c0a2d-7b34-487c-b965-e29e9231ef7b-kube-api-access-qxwm4\") pod \"migrator-59844c95c7-5bwhq\" (UID: \"081c0a2d-7b34-487c-b965-e29e9231ef7b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5bwhq" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.051385 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b7a27ba4-0f5f-4ad6-9883-8698fd160802-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-m9xxt\" (UID: \"b7a27ba4-0f5f-4ad6-9883-8698fd160802\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m9xxt" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.051403 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e3945afb-20d9-4312-b9bb-dbe5bc788cda-signing-cabundle\") pod \"service-ca-9c57cc56f-qflxg\" (UID: \"e3945afb-20d9-4312-b9bb-dbe5bc788cda\") " pod="openshift-service-ca/service-ca-9c57cc56f-qflxg" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.051423 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b0b74168-914c-4a2e-9122-c55d3bc3bcc2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-pcwhp\" (UID: 
\"b0b74168-914c-4a2e-9122-c55d3bc3bcc2\") " pod="openshift-marketplace/marketplace-operator-79b997595-pcwhp" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.051447 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/236033b2-63e5-43e2-a9b7-3549f0802c30-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-dk5lq\" (UID: \"236033b2-63e5-43e2-a9b7-3549f0802c30\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dk5lq" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.051463 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ce5ae64a-c99b-4700-aef6-23a77794a308-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2jl6z\" (UID: \"ce5ae64a-c99b-4700-aef6-23a77794a308\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jl6z" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.051492 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b0b74168-914c-4a2e-9122-c55d3bc3bcc2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-pcwhp\" (UID: \"b0b74168-914c-4a2e-9122-c55d3bc3bcc2\") " pod="openshift-marketplace/marketplace-operator-79b997595-pcwhp" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.051511 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/3f819915-64a0-4327-ac01-5ff842cbc592-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-646sv\" (UID: \"3f819915-64a0-4327-ac01-5ff842cbc592\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-646sv" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.051529 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rvf2\" (UniqueName: \"kubernetes.io/projected/b74d4447-5cdf-482d-bc97-81cc3f6f5f1c-kube-api-access-9rvf2\") pod \"machine-config-server-qsvzc\" (UID: \"b74d4447-5cdf-482d-bc97-81cc3f6f5f1c\") " pod="openshift-machine-config-operator/machine-config-server-qsvzc" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.051544 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af1a70ec-af98-44c7-a59c-2d5a6d5c1200-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tg9t2\" (UID: \"af1a70ec-af98-44c7-a59c-2d5a6d5c1200\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tg9t2" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.051559 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0306bf0e-f0c1-4e47-b63e-909b979c5844-secret-volume\") pod \"collect-profiles-29567745-czvxj\" (UID: \"0306bf0e-f0c1-4e47-b63e-909b979c5844\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567745-czvxj" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.051786 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3a049943-f882-47d7-ac4e-703eebca8103-profile-collector-cert\") pod 
\"catalog-operator-68c6474976-fb258\" (UID: \"3a049943-f882-47d7-ac4e-703eebca8103\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fb258" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.051820 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzbh5\" (UniqueName: \"kubernetes.io/projected/4cc9f216-4352-4bff-a0eb-48659e3a603d-kube-api-access-tzbh5\") pod \"service-ca-operator-777779d784-vktqs\" (UID: \"4cc9f216-4352-4bff-a0eb-48659e3a603d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vktqs" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.051850 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv8pp\" (UniqueName: \"kubernetes.io/projected/0306bf0e-f0c1-4e47-b63e-909b979c5844-kube-api-access-mv8pp\") pod \"collect-profiles-29567745-czvxj\" (UID: \"0306bf0e-f0c1-4e47-b63e-909b979c5844\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567745-czvxj" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.051880 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af1a70ec-af98-44c7-a59c-2d5a6d5c1200-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tg9t2\" (UID: \"af1a70ec-af98-44c7-a59c-2d5a6d5c1200\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tg9t2" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.051898 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzkd6\" (UniqueName: \"kubernetes.io/projected/ea531307-d9b1-4798-bac8-d34094d27e2c-kube-api-access-bzkd6\") pod \"olm-operator-6b444d44fb-mlpft\" (UID: \"ea531307-d9b1-4798-bac8-d34094d27e2c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mlpft" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.051998 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af1a70ec-af98-44c7-a59c-2d5a6d5c1200-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tg9t2\" (UID: \"af1a70ec-af98-44c7-a59c-2d5a6d5c1200\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tg9t2" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.052106 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f17fa271-26a6-4620-930b-f30d50f3412b-webhook-cert\") pod \"packageserver-d55dfcdfc-skpjj\" (UID: \"f17fa271-26a6-4620-930b-f30d50f3412b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-skpjj" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.052132 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2d03fe9-f23e-4f3d-a281-b80ba7fa7f38-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-n8ttn\" (UID: \"c2d03fe9-f23e-4f3d-a281-b80ba7fa7f38\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n8ttn" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.052151 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b74d4447-5cdf-482d-bc97-81cc3f6f5f1c-node-bootstrap-token\") pod 
\"machine-config-server-qsvzc\" (UID: \"b74d4447-5cdf-482d-bc97-81cc3f6f5f1c\") " pod="openshift-machine-config-operator/machine-config-server-qsvzc" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.052183 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkt2m\" (UniqueName: \"kubernetes.io/projected/b0b74168-914c-4a2e-9122-c55d3bc3bcc2-kube-api-access-hkt2m\") pod \"marketplace-operator-79b997595-pcwhp\" (UID: \"b0b74168-914c-4a2e-9122-c55d3bc3bcc2\") " pod="openshift-marketplace/marketplace-operator-79b997595-pcwhp" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.052205 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b695978e-b67c-4812-9083-22538cdd3045-registration-dir\") pod \"csi-hostpathplugin-2pgbl\" (UID: \"b695978e-b67c-4812-9083-22538cdd3045\") " pod="hostpath-provisioner/csi-hostpathplugin-2pgbl" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.052231 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k45qr\" (UniqueName: \"kubernetes.io/projected/6a66c652-318d-4fc7-9bff-a73d65b12966-kube-api-access-k45qr\") pod \"dns-default-hncrq\" (UID: \"6a66c652-318d-4fc7-9bff-a73d65b12966\") " pod="openshift-dns/dns-default-hncrq" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.052233 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f17fa271-26a6-4620-930b-f30d50f3412b-apiservice-cert\") pod \"packageserver-d55dfcdfc-skpjj\" (UID: \"f17fa271-26a6-4620-930b-f30d50f3412b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-skpjj" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.052264 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g665\" (UniqueName: \"kubernetes.io/projected/3f819915-64a0-4327-ac01-5ff842cbc592-kube-api-access-6g665\") pod \"control-plane-machine-set-operator-78cbb6b69f-646sv\" (UID: \"3f819915-64a0-4327-ac01-5ff842cbc592\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-646sv" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.052285 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a66c652-318d-4fc7-9bff-a73d65b12966-config-volume\") pod \"dns-default-hncrq\" (UID: \"6a66c652-318d-4fc7-9bff-a73d65b12966\") " pod="openshift-dns/dns-default-hncrq" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.052313 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce5ae64a-c99b-4700-aef6-23a77794a308-proxy-tls\") pod \"machine-config-operator-74547568cd-2jl6z\" (UID: \"ce5ae64a-c99b-4700-aef6-23a77794a308\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jl6z" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.052334 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg48m\" (UniqueName: \"kubernetes.io/projected/5a3e73bc-7747-47d5-958b-1adc974f20a9-kube-api-access-kg48m\") pod \"machine-config-controller-84d6567774-l8n7r\" (UID: \"5a3e73bc-7747-47d5-958b-1adc974f20a9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l8n7r" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.052342 4685 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cc9f216-4352-4bff-a0eb-48659e3a603d-config\") pod \"service-ca-operator-777779d784-vktqs\" (UID: \"4cc9f216-4352-4bff-a0eb-48659e3a603d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vktqs" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.052387 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/119e0da8-6d9c-48cc-ab21-85a78ca95c5c-cert\") pod \"ingress-canary-k7m7b\" (UID: \"119e0da8-6d9c-48cc-ab21-85a78ca95c5c\") " pod="openshift-ingress-canary/ingress-canary-k7m7b" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.052407 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b695978e-b67c-4812-9083-22538cdd3045-csi-data-dir\") pod \"csi-hostpathplugin-2pgbl\" (UID: \"b695978e-b67c-4812-9083-22538cdd3045\") " pod="hostpath-provisioner/csi-hostpathplugin-2pgbl" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.052425 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b74d4447-5cdf-482d-bc97-81cc3f6f5f1c-certs\") pod \"machine-config-server-qsvzc\" (UID: \"b74d4447-5cdf-482d-bc97-81cc3f6f5f1c\") " pod="openshift-machine-config-operator/machine-config-server-qsvzc" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.052442 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjpgn\" (UniqueName: \"kubernetes.io/projected/3a049943-f882-47d7-ac4e-703eebca8103-kube-api-access-mjpgn\") pod \"catalog-operator-68c6474976-fb258\" (UID: \"3a049943-f882-47d7-ac4e-703eebca8103\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fb258" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.053866 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7a27ba4-0f5f-4ad6-9883-8698fd160802-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-m9xxt\" (UID: \"b7a27ba4-0f5f-4ad6-9883-8698fd160802\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m9xxt" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.054611 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e3945afb-20d9-4312-b9bb-dbe5bc788cda-signing-key\") pod \"service-ca-9c57cc56f-qflxg\" (UID: \"e3945afb-20d9-4312-b9bb-dbe5bc788cda\") " pod="openshift-service-ca/service-ca-9c57cc56f-qflxg" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.054728 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/38f42eb2-0cf6-4c5b-8159-a89e22404a73-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-n9mn4\" (UID: \"38f42eb2-0cf6-4c5b-8159-a89e22404a73\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-n9mn4" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.054916 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ea531307-d9b1-4798-bac8-d34094d27e2c-srv-cert\") pod \"olm-operator-6b444d44fb-mlpft\" (UID: \"ea531307-d9b1-4798-bac8-d34094d27e2c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mlpft" Mar 21 03:49:46 crc 
kubenswrapper[4685]: I0321 03:49:46.054990 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b695978e-b67c-4812-9083-22538cdd3045-socket-dir\") pod \"csi-hostpathplugin-2pgbl\" (UID: \"b695978e-b67c-4812-9083-22538cdd3045\") " pod="hostpath-provisioner/csi-hostpathplugin-2pgbl" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.055317 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7a27ba4-0f5f-4ad6-9883-8698fd160802-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-m9xxt\" (UID: \"b7a27ba4-0f5f-4ad6-9883-8698fd160802\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m9xxt" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.055706 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ea531307-d9b1-4798-bac8-d34094d27e2c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mlpft\" (UID: \"ea531307-d9b1-4798-bac8-d34094d27e2c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mlpft" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.056241 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a66c652-318d-4fc7-9bff-a73d65b12966-config-volume\") pod \"dns-default-hncrq\" (UID: \"6a66c652-318d-4fc7-9bff-a73d65b12966\") " pod="openshift-dns/dns-default-hncrq" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.056269 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4cc9f216-4352-4bff-a0eb-48659e3a603d-serving-cert\") pod \"service-ca-operator-777779d784-vktqs\" (UID: \"4cc9f216-4352-4bff-a0eb-48659e3a603d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vktqs" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.056341 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b695978e-b67c-4812-9083-22538cdd3045-csi-data-dir\") pod \"csi-hostpathplugin-2pgbl\" (UID: \"b695978e-b67c-4812-9083-22538cdd3045\") " pod="hostpath-provisioner/csi-hostpathplugin-2pgbl" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.056413 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b695978e-b67c-4812-9083-22538cdd3045-plugins-dir\") pod \"csi-hostpathplugin-2pgbl\" (UID: \"b695978e-b67c-4812-9083-22538cdd3045\") " pod="hostpath-provisioner/csi-hostpathplugin-2pgbl" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.057104 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af1a70ec-af98-44c7-a59c-2d5a6d5c1200-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tg9t2\" (UID: \"af1a70ec-af98-44c7-a59c-2d5a6d5c1200\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tg9t2" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.057443 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/236033b2-63e5-43e2-a9b7-3549f0802c30-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-dk5lq\" (UID: \"236033b2-63e5-43e2-a9b7-3549f0802c30\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dk5lq" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.057778 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f17fa271-26a6-4620-930b-f30d50f3412b-webhook-cert\") pod \"packageserver-d55dfcdfc-skpjj\" (UID: \"f17fa271-26a6-4620-930b-f30d50f3412b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-skpjj" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.057956 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b695978e-b67c-4812-9083-22538cdd3045-registration-dir\") pod \"csi-hostpathplugin-2pgbl\" (UID: \"b695978e-b67c-4812-9083-22538cdd3045\") " pod="hostpath-provisioner/csi-hostpathplugin-2pgbl" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.058043 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2c9z\" (UniqueName: \"kubernetes.io/projected/7b429d7c-3700-4d9f-b6b2-554219223515-kube-api-access-j2c9z\") pod \"console-operator-58897d9998-djlmv\" (UID: \"7b429d7c-3700-4d9f-b6b2-554219223515\") " pod="openshift-console-operator/console-operator-58897d9998-djlmv" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.058396 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b0b74168-914c-4a2e-9122-c55d3bc3bcc2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-pcwhp\" (UID: \"b0b74168-914c-4a2e-9122-c55d3bc3bcc2\") " pod="openshift-marketplace/marketplace-operator-79b997595-pcwhp" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.063546 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b74d4447-5cdf-482d-bc97-81cc3f6f5f1c-certs\") pod \"machine-config-server-qsvzc\" (UID: \"b74d4447-5cdf-482d-bc97-81cc3f6f5f1c\") " pod="openshift-machine-config-operator/machine-config-server-qsvzc" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.063583 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ce5ae64a-c99b-4700-aef6-23a77794a308-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2jl6z\" (UID: \"ce5ae64a-c99b-4700-aef6-23a77794a308\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jl6z" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.066284 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e3945afb-20d9-4312-b9bb-dbe5bc788cda-signing-cabundle\") pod \"service-ca-9c57cc56f-qflxg\" (UID: \"e3945afb-20d9-4312-b9bb-dbe5bc788cda\") " pod="openshift-service-ca/service-ca-9c57cc56f-qflxg" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.067145 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b0b74168-914c-4a2e-9122-c55d3bc3bcc2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-pcwhp\" (UID: \"b0b74168-914c-4a2e-9122-c55d3bc3bcc2\") " pod="openshift-marketplace/marketplace-operator-79b997595-pcwhp" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.068947 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/ce5ae64a-c99b-4700-aef6-23a77794a308-images\") pod \"machine-config-operator-74547568cd-2jl6z\" (UID: \"ce5ae64a-c99b-4700-aef6-23a77794a308\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jl6z" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.069005 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b695978e-b67c-4812-9083-22538cdd3045-mountpoint-dir\") pod \"csi-hostpathplugin-2pgbl\" (UID: \"b695978e-b67c-4812-9083-22538cdd3045\") " pod="hostpath-provisioner/csi-hostpathplugin-2pgbl" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.069566 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/236033b2-63e5-43e2-a9b7-3549f0802c30-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-dk5lq\" (UID: \"236033b2-63e5-43e2-a9b7-3549f0802c30\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dk5lq" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.070479 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a3e73bc-7747-47d5-958b-1adc974f20a9-proxy-tls\") pod \"machine-config-controller-84d6567774-l8n7r\" (UID: \"5a3e73bc-7747-47d5-958b-1adc974f20a9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l8n7r" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.071423 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0306bf0e-f0c1-4e47-b63e-909b979c5844-secret-volume\") pod \"collect-profiles-29567745-czvxj\" (UID: \"0306bf0e-f0c1-4e47-b63e-909b979c5844\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567745-czvxj" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.071431 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3a049943-f882-47d7-ac4e-703eebca8103-profile-collector-cert\") pod \"catalog-operator-68c6474976-fb258\" (UID: \"3a049943-f882-47d7-ac4e-703eebca8103\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fb258" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.071620 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b74d4447-5cdf-482d-bc97-81cc3f6f5f1c-node-bootstrap-token\") pod \"machine-config-server-qsvzc\" (UID: \"b74d4447-5cdf-482d-bc97-81cc3f6f5f1c\") " pod="openshift-machine-config-operator/machine-config-server-qsvzc" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.071977 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/119e0da8-6d9c-48cc-ab21-85a78ca95c5c-cert\") pod \"ingress-canary-k7m7b\" (UID: \"119e0da8-6d9c-48cc-ab21-85a78ca95c5c\") " pod="openshift-ingress-canary/ingress-canary-k7m7b" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.071994 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3a049943-f882-47d7-ac4e-703eebca8103-srv-cert\") pod \"catalog-operator-68c6474976-fb258\" (UID: \"3a049943-f882-47d7-ac4e-703eebca8103\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fb258" Mar 21 03:49:46 crc 
kubenswrapper[4685]: I0321 03:49:46.072128 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/3f819915-64a0-4327-ac01-5ff842cbc592-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-646sv\" (UID: \"3f819915-64a0-4327-ac01-5ff842cbc592\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-646sv" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.072271 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af1a70ec-af98-44c7-a59c-2d5a6d5c1200-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tg9t2\" (UID: \"af1a70ec-af98-44c7-a59c-2d5a6d5c1200\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tg9t2" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.073170 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce5ae64a-c99b-4700-aef6-23a77794a308-proxy-tls\") pod \"machine-config-operator-74547568cd-2jl6z\" (UID: \"ce5ae64a-c99b-4700-aef6-23a77794a308\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jl6z" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.076509 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd62f2fc-10ef-4a6e-80f1-0813f3a681bd-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-p5r46\" (UID: \"dd62f2fc-10ef-4a6e-80f1-0813f3a681bd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-p5r46" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.077004 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a66c652-318d-4fc7-9bff-a73d65b12966-metrics-tls\") pod \"dns-default-hncrq\" (UID: \"6a66c652-318d-4fc7-9bff-a73d65b12966\") " pod="openshift-dns/dns-default-hncrq" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.077251 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2d03fe9-f23e-4f3d-a281-b80ba7fa7f38-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-n8ttn\" (UID: \"c2d03fe9-f23e-4f3d-a281-b80ba7fa7f38\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n8ttn" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.093566 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmxjl\" (UniqueName: \"kubernetes.io/projected/eb165aaf-36a6-4965-bebf-6a40e1695b94-kube-api-access-qmxjl\") pod \"downloads-7954f5f757-clz2m\" (UID: \"eb165aaf-36a6-4965-bebf-6a40e1695b94\") " pod="openshift-console/downloads-7954f5f757-clz2m" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.114075 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksnq6\" (UniqueName: \"kubernetes.io/projected/b13d8427-dcef-4925-92b6-0e6bf1aca8c8-kube-api-access-ksnq6\") pod \"router-default-5444994796-vds97\" (UID: \"b13d8427-dcef-4925-92b6-0e6bf1aca8c8\") " pod="openshift-ingress/router-default-5444994796-vds97" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.140987 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xvp2w\" (UniqueName: \"kubernetes.io/projected/8f518bb5-0336-44df-ac60-174c8426974e-kube-api-access-xvp2w\") pod \"openshift-config-operator-7777fb866f-2ptjk\" (UID: \"8f518bb5-0336-44df-ac60-174c8426974e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2ptjk" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.152935 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:46 crc kubenswrapper[4685]: E0321 03:49:46.153109 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:46.653089038 +0000 UTC m=+219.130157830 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.153425 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:46 crc kubenswrapper[4685]: E0321 03:49:46.153680 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:46.653670836 +0000 UTC m=+219.130739628 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.159067 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k542\" (UniqueName: \"kubernetes.io/projected/bb0bd79d-5459-4b35-adb8-eca9d0fb069c-kube-api-access-2k542\") pod \"authentication-operator-69f744f599-cd44l\" (UID: \"bb0bd79d-5459-4b35-adb8-eca9d0fb069c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cd44l" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.163041 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2ptjk" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.175404 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqdgc\" (UniqueName: \"kubernetes.io/projected/5a823511-d878-4e6d-acda-4202e00e3aab-kube-api-access-tqdgc\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.194473 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czl5l\" (UniqueName: \"kubernetes.io/projected/49f645cb-7805-4ded-9f3e-d43bdb3801a6-kube-api-access-czl5l\") pod \"console-f9d7485db-j8qrk\" (UID: \"49f645cb-7805-4ded-9f3e-d43bdb3801a6\") " pod="openshift-console/console-f9d7485db-j8qrk" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.195155 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2ckzq" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.203363 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-j8qrk" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.212523 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2efbba77-f5bd-48cb-a790-f4c3564acb75-bound-sa-token\") pod \"ingress-operator-5b745b69d9-jjffn\" (UID: \"2efbba77-f5bd-48cb-a790-f4c3564acb75\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jjffn" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.218587 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-djlmv" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.232988 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6tr2m" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.236076 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s67x\" (UniqueName: \"kubernetes.io/projected/9e7908ff-df99-4827-8a79-ce0b7f0f5d80-kube-api-access-2s67x\") pod \"etcd-operator-b45778765-97584\" (UID: \"9e7908ff-df99-4827-8a79-ce0b7f0f5d80\") " pod="openshift-etcd-operator/etcd-operator-b45778765-97584" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.241109 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-p5r46" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.249339 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jjffn" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.254999 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-clz2m" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.255085 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:46 crc kubenswrapper[4685]: E0321 03:49:46.255485 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:46.755464731 +0000 UTC m=+219.232533533 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.255606 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:46 crc kubenswrapper[4685]: E0321 03:49:46.255915 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:46.755908074 +0000 UTC m=+219.232976866 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.258745 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xtkn\" (UniqueName: \"kubernetes.io/projected/d841e9a0-5181-40b8-9374-daa38341c4ff-kube-api-access-6xtkn\") pod \"dns-operator-744455d44c-52mpk\" (UID: \"d841e9a0-5181-40b8-9374-daa38341c4ff\") " pod="openshift-dns-operator/dns-operator-744455d44c-52mpk" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.273187 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgn8b\" (UniqueName: \"kubernetes.io/projected/ca5cb48c-43cc-428f-bf99-ba396d595e5c-kube-api-access-rgn8b\") pod \"apiserver-7bbb656c7d-tb9dl\" (UID: \"ca5cb48c-43cc-428f-bf99-ba396d595e5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.297115 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-52mpk" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.298184 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-vds97" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.321031 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frlpd\" (UniqueName: \"kubernetes.io/projected/844b8e4f-7ba3-4a52-b6a7-0e8ea0d78003-kube-api-access-frlpd\") pod \"cluster-samples-operator-665b6dd947-2tvkz\" (UID: \"844b8e4f-7ba3-4a52-b6a7-0e8ea0d78003\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2tvkz" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.336056 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhvl9\" (UniqueName: \"kubernetes.io/projected/38f42eb2-0cf6-4c5b-8159-a89e22404a73-kube-api-access-zhvl9\") pod \"multus-admission-controller-857f4d67dd-n9mn4\" (UID: \"38f42eb2-0cf6-4c5b-8159-a89e22404a73\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-n9mn4" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.348362 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-n9mn4" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.353294 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9ffg\" (UniqueName: \"kubernetes.io/projected/f17fa271-26a6-4620-930b-f30d50f3412b-kube-api-access-q9ffg\") pod \"packageserver-d55dfcdfc-skpjj\" (UID: \"f17fa271-26a6-4620-930b-f30d50f3412b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-skpjj" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.357221 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:46 crc kubenswrapper[4685]: E0321 03:49:46.357799 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:46.857784332 +0000 UTC m=+219.334853124 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.375293 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmb7h\" (UniqueName: \"kubernetes.io/projected/b7a27ba4-0f5f-4ad6-9883-8698fd160802-kube-api-access-gmb7h\") pod \"cluster-image-registry-operator-dc59b4c8b-m9xxt\" (UID: \"b7a27ba4-0f5f-4ad6-9883-8698fd160802\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m9xxt" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.398220 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fr6x\" (UniqueName: \"kubernetes.io/projected/b695978e-b67c-4812-9083-22538cdd3045-kube-api-access-2fr6x\") pod \"csi-hostpathplugin-2pgbl\" (UID: \"b695978e-b67c-4812-9083-22538cdd3045\") " pod="hostpath-provisioner/csi-hostpathplugin-2pgbl" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.418470 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phckr\" (UniqueName: \"kubernetes.io/projected/e3945afb-20d9-4312-b9bb-dbe5bc788cda-kube-api-access-phckr\") pod \"service-ca-9c57cc56f-qflxg\" (UID: \"e3945afb-20d9-4312-b9bb-dbe5bc788cda\") " pod="openshift-service-ca/service-ca-9c57cc56f-qflxg" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.427370 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2ptjk"] Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.430673 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-cd44l" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.435437 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-2pgbl" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.435626 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgshs\" (UniqueName: \"kubernetes.io/projected/4ede3f08-f29b-4cb9-a96f-1c66239498f6-kube-api-access-kgshs\") pod \"auto-csr-approver-29567748-zv7h8\" (UID: \"4ede3f08-f29b-4cb9-a96f-1c66239498f6\") " pod="openshift-infra/auto-csr-approver-29567748-zv7h8" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.452815 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-j8qrk"] Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.458614 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:46 crc kubenswrapper[4685]: E0321 03:49:46.460513 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:46.960500164 +0000 UTC m=+219.437568956 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.466169 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjpgn\" (UniqueName: \"kubernetes.io/projected/3a049943-f882-47d7-ac4e-703eebca8103-kube-api-access-mjpgn\") pod \"catalog-operator-68c6474976-fb258\" (UID: \"3a049943-f882-47d7-ac4e-703eebca8103\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fb258" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.472517 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fb258" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.476596 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k45qr\" (UniqueName: \"kubernetes.io/projected/6a66c652-318d-4fc7-9bff-a73d65b12966-kube-api-access-k45qr\") pod \"dns-default-hncrq\" (UID: \"6a66c652-318d-4fc7-9bff-a73d65b12966\") " pod="openshift-dns/dns-default-hncrq" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.477731 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2tvkz" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.489664 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.511316 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-97584" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.515250 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af1a70ec-af98-44c7-a59c-2d5a6d5c1200-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tg9t2\" (UID: \"af1a70ec-af98-44c7-a59c-2d5a6d5c1200\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tg9t2" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.515994 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzkd6\" (UniqueName: \"kubernetes.io/projected/ea531307-d9b1-4798-bac8-d34094d27e2c-kube-api-access-bzkd6\") pod \"olm-operator-6b444d44fb-mlpft\" (UID: \"ea531307-d9b1-4798-bac8-d34094d27e2c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mlpft" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.542475 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rvf2\" (UniqueName: \"kubernetes.io/projected/b74d4447-5cdf-482d-bc97-81cc3f6f5f1c-kube-api-access-9rvf2\") pod \"machine-config-server-qsvzc\" (UID: \"b74d4447-5cdf-482d-bc97-81cc3f6f5f1c\") " pod="openshift-machine-config-operator/machine-config-server-qsvzc" Mar 21 03:49:46 crc kubenswrapper[4685]: W0321 03:49:46.551902 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49f645cb_7805_4ded_9f3e_d43bdb3801a6.slice/crio-91c8a1aa6642b767de4ae0e791beeca5d50141199fd5726370850372642e8b39 WatchSource:0}: Error finding container 91c8a1aa6642b767de4ae0e791beeca5d50141199fd5726370850372642e8b39: Status 404 returned error can't find the container with id 91c8a1aa6642b767de4ae0e791beeca5d50141199fd5726370850372642e8b39 Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.562093 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:46 crc kubenswrapper[4685]: E0321 03:49:46.562229 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:47.062206577 +0000 UTC m=+219.539275369 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.562484 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:46 crc kubenswrapper[4685]: E0321 03:49:46.562778 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:47.062771084 +0000 UTC m=+219.539839876 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.563251 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g665\" (UniqueName: \"kubernetes.io/projected/3f819915-64a0-4327-ac01-5ff842cbc592-kube-api-access-6g665\") pod \"control-plane-machine-set-operator-78cbb6b69f-646sv\" (UID: \"3f819915-64a0-4327-ac01-5ff842cbc592\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-646sv" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.569064 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tg9t2" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.584459 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7lvv\" (UniqueName: \"kubernetes.io/projected/c2d03fe9-f23e-4f3d-a281-b80ba7fa7f38-kube-api-access-w7lvv\") pod \"package-server-manager-789f6589d5-n8ttn\" (UID: \"c2d03fe9-f23e-4f3d-a281-b80ba7fa7f38\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n8ttn" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.613610 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kptqv\" (UniqueName: \"kubernetes.io/projected/119e0da8-6d9c-48cc-ab21-85a78ca95c5c-kube-api-access-kptqv\") pod \"ingress-canary-k7m7b\" (UID: \"119e0da8-6d9c-48cc-ab21-85a78ca95c5c\") " pod="openshift-ingress-canary/ingress-canary-k7m7b" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.625858 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6tr2m"] Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.627360 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-646sv" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.627487 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkt2m\" (UniqueName: \"kubernetes.io/projected/b0b74168-914c-4a2e-9122-c55d3bc3bcc2-kube-api-access-hkt2m\") pod \"marketplace-operator-79b997595-pcwhp\" (UID: \"b0b74168-914c-4a2e-9122-c55d3bc3bcc2\") " pod="openshift-marketplace/marketplace-operator-79b997595-pcwhp" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.632604 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mlpft" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.641053 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-skpjj" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.647060 4685 generic.go:334] "Generic (PLEG): container finished" podID="9c1c82f3-080b-47ea-93df-596d79aa2bf8" containerID="cff3a90a5d0dce0bdc1c08fa249dff8b0fa3ff3d690bcc40658298f4484977e5" exitCode=0 Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.647165 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-864h4" event={"ID":"9c1c82f3-080b-47ea-93df-596d79aa2bf8","Type":"ContainerDied","Data":"cff3a90a5d0dce0bdc1c08fa249dff8b0fa3ff3d690bcc40658298f4484977e5"} Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.655631 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hncrq" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.658049 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b7a27ba4-0f5f-4ad6-9883-8698fd160802-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-m9xxt\" (UID: \"b7a27ba4-0f5f-4ad6-9883-8698fd160802\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m9xxt" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.660460 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-flxks" event={"ID":"1ee95c71-cb75-4357-aeff-c0417a0c6eb3","Type":"ContainerStarted","Data":"07a34d133973de2b152c77aca886856de8c7a18eb607db88191c49497d04994c"} Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.660507 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-flxks" event={"ID":"1ee95c71-cb75-4357-aeff-c0417a0c6eb3","Type":"ContainerStarted","Data":"b643acd731ccb18f7b7782f8fbfc413895fecbf6d38ee2e3c764ace2524bcb88"} Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.660522 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-flxks" event={"ID":"1ee95c71-cb75-4357-aeff-c0417a0c6eb3","Type":"ContainerStarted","Data":"7ac29acaf92f0b10e901f5ccbd8418d1717025a63e3d5266f99a800bb30ba95e"} Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.660895 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxwm4\" (UniqueName: \"kubernetes.io/projected/081c0a2d-7b34-487c-b965-e29e9231ef7b-kube-api-access-qxwm4\") pod \"migrator-59844c95c7-5bwhq\" (UID: \"081c0a2d-7b34-487c-b965-e29e9231ef7b\") " 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5bwhq" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.662539 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h" event={"ID":"ce8529fe-4546-466d-bb07-3ee73cf1bc1f","Type":"ContainerStarted","Data":"6ac1c261b76689f8fe7cc196075da5de0b5071a522baff19ac6f7038191a2302"} Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.662568 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h" event={"ID":"ce8529fe-4546-466d-bb07-3ee73cf1bc1f","Type":"ContainerStarted","Data":"e24c4fa3a42c700e29624cf05630b1d7f1435c08e4277b11c630c72f423b271a"} Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.663142 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.663255 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.663290 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n8ttn" Mar 21 03:49:46 crc kubenswrapper[4685]: E0321 03:49:46.663733 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:47.163698942 +0000 UTC m=+219.640767734 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.665066 4685 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-m7q8h container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" start-of-body= Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.665101 4685 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h" podUID="ce8529fe-4546-466d-bb07-3ee73cf1bc1f" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.666728 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mkjd2" event={"ID":"cd1c8c06-710c-401b-803e-9cc18aa1b4b6","Type":"ContainerStarted","Data":"74e90300e77c4a10664ef3e64ba0c75e3b8d29360bcbddcadac288be6716b6ba"} Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.666753 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mkjd2" event={"ID":"cd1c8c06-710c-401b-803e-9cc18aa1b4b6","Type":"ContainerStarted","Data":"25366635beebe6b2f8548a1976915b7b51b8c578a70ba4d2a290c186dc0de89c"} Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.667347 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-mkjd2" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.670414 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-qflxg" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.671239 4685 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-mkjd2 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.671281 4685 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-mkjd2" podUID="cd1c8c06-710c-401b-803e-9cc18aa1b4b6" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.676119 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plggv" event={"ID":"588eb87c-d2c0-45fb-a0f7-33de36d5d745","Type":"ContainerStarted","Data":"1c65935116820cb8f53969211726b48113307e8d74c8fca574cb4bca5111d41d"} Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.677775 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plggv" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.679639 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-j8qrk" event={"ID":"49f645cb-7805-4ded-9f3e-d43bdb3801a6","Type":"ContainerStarted","Data":"91c8a1aa6642b767de4ae0e791beeca5d50141199fd5726370850372642e8b39"} Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.682044 4685 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-plggv container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.682096 4685 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plggv" podUID="588eb87c-d2c0-45fb-a0f7-33de36d5d745" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.683707 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-84ftw" event={"ID":"e6a042fb-9acf-4d69-9583-23ccf76753f8","Type":"ContainerStarted","Data":"aa719499d81ca0eb350c7a91341bfb6a25ebb4f81a3b59e9cf49cb25dda42a7e"} Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.684880 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfjj7\" (UniqueName: \"kubernetes.io/projected/236033b2-63e5-43e2-a9b7-3549f0802c30-kube-api-access-dfjj7\") pod \"kube-storage-version-migrator-operator-b67b599dd-dk5lq\" (UID: \"236033b2-63e5-43e2-a9b7-3549f0802c30\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dk5lq" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.687292 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-vds97" 
event={"ID":"b13d8427-dcef-4925-92b6-0e6bf1aca8c8","Type":"ContainerStarted","Data":"b8aa6d7d4837835e90ef01f71453c092de6a04db3cf9222361c58117b4506bb9"} Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.687324 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-vds97" event={"ID":"b13d8427-dcef-4925-92b6-0e6bf1aca8c8","Type":"ContainerStarted","Data":"dea195b191d538e2f938504880d3e462c5d6ff12b41ff298b571e1d1821aa220"} Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.691258 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2ptjk" event={"ID":"8f518bb5-0336-44df-ac60-174c8426974e","Type":"ContainerStarted","Data":"7cf39f914eb706536a9478281c4ea3e3323fd09e3eb0447a5512e1743f571047"} Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.698077 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv8pp\" (UniqueName: \"kubernetes.io/projected/0306bf0e-f0c1-4e47-b63e-909b979c5844-kube-api-access-mv8pp\") pod \"collect-profiles-29567745-czvxj\" (UID: \"0306bf0e-f0c1-4e47-b63e-909b979c5844\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567745-czvxj" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.699288 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2ckzq"] Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.703055 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-djlmv"] Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.710522 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567748-zv7h8" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.712445 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm64d\" (UniqueName: \"kubernetes.io/projected/ce5ae64a-c99b-4700-aef6-23a77794a308-kube-api-access-zm64d\") pod \"machine-config-operator-74547568cd-2jl6z\" (UID: \"ce5ae64a-c99b-4700-aef6-23a77794a308\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jl6z" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.718661 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567745-czvxj" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.728277 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-qsvzc" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.735742 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzbh5\" (UniqueName: \"kubernetes.io/projected/4cc9f216-4352-4bff-a0eb-48659e3a603d-kube-api-access-tzbh5\") pod \"service-ca-operator-777779d784-vktqs\" (UID: \"4cc9f216-4352-4bff-a0eb-48659e3a603d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vktqs" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.756295 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-k7m7b" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.758486 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg48m\" (UniqueName: \"kubernetes.io/projected/5a3e73bc-7747-47d5-958b-1adc974f20a9-kube-api-access-kg48m\") pod \"machine-config-controller-84d6567774-l8n7r\" (UID: \"5a3e73bc-7747-47d5-958b-1adc974f20a9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l8n7r" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.766136 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:46 crc kubenswrapper[4685]: E0321 03:49:46.768818 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:47.268804418 +0000 UTC m=+219.745873210 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.861644 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m9xxt" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.875350 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:46 crc kubenswrapper[4685]: E0321 03:49:46.877765 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:47.377722979 +0000 UTC m=+219.854791771 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.880516 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:46 crc kubenswrapper[4685]: E0321 03:49:46.881084 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:47.3810691 +0000 UTC m=+219.858137892 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.890410 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jl6z" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.896726 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-52mpk"] Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.898250 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dk5lq" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.909762 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5bwhq" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.914073 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l8n7r" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.916230 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-p5r46"] Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.923348 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-clz2m"] Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.925078 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-n9mn4"] Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.925361 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-pcwhp" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.937018 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-jjffn"] Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.977928 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vktqs" Mar 21 03:49:46 crc kubenswrapper[4685]: I0321 03:49:46.983044 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:46 crc kubenswrapper[4685]: E0321 03:49:46.983514 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:47.483485144 +0000 UTC m=+219.960553946 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:47 crc kubenswrapper[4685]: W0321 03:49:47.001443 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38f42eb2_0cf6_4c5b_8159_a89e22404a73.slice/crio-4a5f6ba71522cfabb4fb51d028f0cabf962291348f692768c1bd325145a41716 WatchSource:0}: Error finding container 4a5f6ba71522cfabb4fb51d028f0cabf962291348f692768c1bd325145a41716: Status 404 returned error can't find the container with id 4a5f6ba71522cfabb4fb51d028f0cabf962291348f692768c1bd325145a41716 Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.034480 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2pgbl"] Mar 21 03:49:47 crc kubenswrapper[4685]: W0321 03:49:47.051091 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb165aaf_36a6_4965_bebf_6a40e1695b94.slice/crio-6e6becb0051368015cc1a7b41a15fa151ff074593602704953f4c4ff71221f33 WatchSource:0}: Error finding container 6e6becb0051368015cc1a7b41a15fa151ff074593602704953f4c4ff71221f33: Status 404 returned error can't find the container with id 6e6becb0051368015cc1a7b41a15fa151ff074593602704953f4c4ff71221f33 Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.053584 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-vds97" podStartSLOduration=177.053570698 podStartE2EDuration="2m57.053570698s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:47.052689001 +0000 UTC m=+219.529757803" watchObservedRunningTime="2026-03-21 03:49:47.053570698 +0000 UTC 
m=+219.530639490" Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.053783 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fb258"] Mar 21 03:49:47 crc kubenswrapper[4685]: W0321 03:49:47.080505 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd62f2fc_10ef_4a6e_80f1_0813f3a681bd.slice/crio-50984a8834ab7ccaf5f7fdef17bd3665b8caa11f99c711cc128908027d5701cf WatchSource:0}: Error finding container 50984a8834ab7ccaf5f7fdef17bd3665b8caa11f99c711cc128908027d5701cf: Status 404 returned error can't find the container with id 50984a8834ab7ccaf5f7fdef17bd3665b8caa11f99c711cc128908027d5701cf Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.084891 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:47 crc kubenswrapper[4685]: E0321 03:49:47.085290 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:47.585277388 +0000 UTC m=+220.062346180 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.168831 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-cd44l"] Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.174139 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-84ftw" podStartSLOduration=177.174126311 podStartE2EDuration="2m57.174126311s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:47.172616735 +0000 UTC m=+219.649685537" watchObservedRunningTime="2026-03-21 03:49:47.174126311 +0000 UTC m=+219.651195103" Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.182715 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2tvkz"] Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.186530 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.188261 4685 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:49:47 crc kubenswrapper[4685]: E0321 03:49:47.188655 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:47.688628061 +0000 UTC m=+220.165696853 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.197299 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.203309 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-646sv"] Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.237328 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-97584"] Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.239864 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl"] Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.253717 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-mkjd2" podStartSLOduration=177.253701173 podStartE2EDuration="2m57.253701173s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:47.252023152 +0000 UTC m=+219.729091944" watchObservedRunningTime="2026-03-21 03:49:47.253701173 +0000 UTC m=+219.730769965" Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.279058 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tg9t2"] Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.283797 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-skpjj"] Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.289603 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: 
\"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.289659 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.289690 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fda9b1ff-e4a8-4d15-8f7b-2974991cd252-metrics-certs\") pod \"network-metrics-daemon-v9rdl\" (UID: \"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\") " pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.289712 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.289740 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:49:47 crc kubenswrapper[4685]: E0321 03:49:47.292079 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:47.790288081 +0000 UTC m=+220.267356873 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.292223 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.293302 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.298827 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-vds97" Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.301056 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fda9b1ff-e4a8-4d15-8f7b-2974991cd252-metrics-certs\") pod \"network-metrics-daemon-v9rdl\" (UID: \"fda9b1ff-e4a8-4d15-8f7b-2974991cd252\") " pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.306015 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.325501 4685 patch_prober.go:28] interesting pod/router-default-5444994796-vds97 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.325555 4685 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vds97" podUID="b13d8427-dcef-4925-92b6-0e6bf1aca8c8" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.327410 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.344802 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.358411 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9rdl" Mar 21 03:49:47 crc kubenswrapper[4685]: W0321 03:49:47.368731 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb0bd79d_5459_4b35_adb8_eca9d0fb069c.slice/crio-d2f999cae8925af27a3164a54c8162b154fb94b7a5ae80ca2bbdf8cebc1afebf WatchSource:0}: Error finding container d2f999cae8925af27a3164a54c8162b154fb94b7a5ae80ca2bbdf8cebc1afebf: Status 404 returned error can't find the container with id d2f999cae8925af27a3164a54c8162b154fb94b7a5ae80ca2bbdf8cebc1afebf Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.371274 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.421068 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:47 crc kubenswrapper[4685]: E0321 03:49:47.441038 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:47.941015199 +0000 UTC m=+220.418083991 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.458341 4685 ???:1] "http: TLS handshake error from 192.168.126.11:35772: no serving certificate available for the kubelet" Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.523424 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:47 crc kubenswrapper[4685]: E0321 03:49:47.523965 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:48.023953803 +0000 UTC m=+220.501022595 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.557644 4685 ???:1] "http: TLS handshake error from 192.168.126.11:35774: no serving certificate available for the kubelet" Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.624399 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:47 crc kubenswrapper[4685]: E0321 03:49:47.624542 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:48.12452085 +0000 UTC m=+220.601589642 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.624648 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:47 crc kubenswrapper[4685]: E0321 03:49:47.625497 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:48.12548907 +0000 UTC m=+220.602557862 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.654569 4685 ???:1] "http: TLS handshake error from 192.168.126.11:35788: no serving certificate available for the kubelet" Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.716690 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hncrq"] Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.716971 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mlpft"] Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.718592 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-qflxg"] Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.724207 4685 generic.go:334] "Generic (PLEG): container finished" podID="8f518bb5-0336-44df-ac60-174c8426974e" containerID="b663ad822c25a362e4d9a3220371a9ac152c3d77942d28bc77f7c7178c160334" exitCode=0 Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.724287 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2ptjk" event={"ID":"8f518bb5-0336-44df-ac60-174c8426974e","Type":"ContainerDied","Data":"b663ad822c25a362e4d9a3220371a9ac152c3d77942d28bc77f7c7178c160334"} Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.725199 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:47 crc kubenswrapper[4685]: E0321 03:49:47.725456 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:48.225442039 +0000 UTC m=+220.702510821 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.736035 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2tvkz" event={"ID":"844b8e4f-7ba3-4a52-b6a7-0e8ea0d78003","Type":"ContainerStarted","Data":"0653dd5e8b11be45faa45394993241a0f5c22bda3fed4f177047d4f67422b80a"} Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.737245 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plggv" podStartSLOduration=177.737231086 podStartE2EDuration="2m57.737231086s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:47.73539628 +0000 UTC m=+220.212465072" watchObservedRunningTime="2026-03-21 03:49:47.737231086 +0000 UTC m=+220.214299878" Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.744776 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-cd44l" event={"ID":"bb0bd79d-5459-4b35-adb8-eca9d0fb069c","Type":"ContainerStarted","Data":"d2f999cae8925af27a3164a54c8162b154fb94b7a5ae80ca2bbdf8cebc1afebf"} Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.746894 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-clz2m" event={"ID":"eb165aaf-36a6-4965-bebf-6a40e1695b94","Type":"ContainerStarted","Data":"6e6becb0051368015cc1a7b41a15fa151ff074593602704953f4c4ff71221f33"} Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.748678 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-864h4" event={"ID":"9c1c82f3-080b-47ea-93df-596d79aa2bf8","Type":"ContainerStarted","Data":"e9cd9f7246c4bdafa240c07e95a3ab216df9b4ea8bf34ebebc9d856321caf4f6"} Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.751169 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl" event={"ID":"ca5cb48c-43cc-428f-bf99-ba396d595e5c","Type":"ContainerStarted","Data":"279daab85098f7acbf5c13e15aa192d0950d9d0fa03d56c17529484c25c7a434"} Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.760648 4685 ???:1] "http: TLS handshake error from 192.168.126.11:35794: no serving certificate available for the kubelet" Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.766589 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n8ttn"] Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.770908 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jjffn" event={"ID":"2efbba77-f5bd-48cb-a790-f4c3564acb75","Type":"ContainerStarted","Data":"88ae18eb7bdf2e40b2daa3ae03637d874a81f4518ba4073b0d5762ede4a962a7"} Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.770954 4685 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jjffn" event={"ID":"2efbba77-f5bd-48cb-a790-f4c3564acb75","Type":"ContainerStarted","Data":"1a83142214c617770de65a47261318614af4de14c7b40f4c3996d1356e8ebd05"} Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.781109 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-l8n7r"] Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.784312 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2pgbl" event={"ID":"b695978e-b67c-4812-9083-22538cdd3045","Type":"ContainerStarted","Data":"854c5082b92b6564f3b7d559958a9f7cd28e3faacd7dbcc7929c2eba62fb2536"} Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.788176 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567748-zv7h8"] Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.788494 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-97584" event={"ID":"9e7908ff-df99-4827-8a79-ce0b7f0f5d80","Type":"ContainerStarted","Data":"a364b0b68ad02aac20da65da784fdd6d5b60691bb8bfe571f6b39974466c2cd0"} Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.802014 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-j8qrk" event={"ID":"49f645cb-7805-4ded-9f3e-d43bdb3801a6","Type":"ContainerStarted","Data":"16f5f48d40f4056847eb51680687538c9d270615ab5d4e703c55e658781b31b5"} Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.805860 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-djlmv" event={"ID":"7b429d7c-3700-4d9f-b6b2-554219223515","Type":"ContainerStarted","Data":"8817cea45757fa2ffdfd786a8d4f094a8fda2bfb32f0056038adad0f4209c418"} Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.805898 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-djlmv" event={"ID":"7b429d7c-3700-4d9f-b6b2-554219223515","Type":"ContainerStarted","Data":"7843bfc31542beff83d7b4ddf6b6c31c1d910fd3eee7dab1be5c1a8329605a3f"} Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.806763 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-djlmv" Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.818946 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-52mpk" event={"ID":"d841e9a0-5181-40b8-9374-daa38341c4ff","Type":"ContainerStarted","Data":"95d29e62e3fdcc9fbf3d03bee7329fb9f64dd10f9784d27ca72f733c495c1c4b"} Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.825127 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2ckzq" event={"ID":"2663314f-3e35-4ab4-b9b9-28e829cde5de","Type":"ContainerStarted","Data":"e1eb60442899d9fd2bf24110138c43d493ec841e6dd7329362eeb915e3d14c0b"} Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.825184 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2ckzq" event={"ID":"2663314f-3e35-4ab4-b9b9-28e829cde5de","Type":"ContainerStarted","Data":"43bcc9767594baa49b53c7201c3fe5630780ffd41eab924f04603d69ce9c7e11"} Mar 21 03:49:47 crc 
kubenswrapper[4685]: I0321 03:49:47.825931 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:47 crc kubenswrapper[4685]: E0321 03:49:47.826223 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:48.326213323 +0000 UTC m=+220.803282115 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.828428 4685 patch_prober.go:28] interesting pod/console-operator-58897d9998-djlmv container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.828471 4685 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-djlmv" podUID="7b429d7c-3700-4d9f-b6b2-554219223515" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.834074 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6tr2m" event={"ID":"7c21a705-7e47-4418-803a-41a459acef90","Type":"ContainerStarted","Data":"3a11dd4471d4c201edfa7a4b6b4ca4a27dd68e3fe5a6a065b8034f3e11f43f83"} Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.844080 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-qsvzc" event={"ID":"b74d4447-5cdf-482d-bc97-81cc3f6f5f1c","Type":"ContainerStarted","Data":"1cea4a262c4b0dfbfa8e2351e5aab7ff861262747b02bb106773f6b1d605a1e0"} Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.844211 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2jl6z"] Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.858206 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-646sv" event={"ID":"3f819915-64a0-4327-ac01-5ff842cbc592","Type":"ContainerStarted","Data":"9bd7e44658145c0359156666d0d66564d474ddd7897bf52993aef06374ec3f02"} Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.860697 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567745-czvxj"] Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.863310 4685 ???:1] "http: TLS handshake error from 
192.168.126.11:35808: no serving certificate available for the kubelet" Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.865175 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-p5r46" event={"ID":"dd62f2fc-10ef-4a6e-80f1-0813f3a681bd","Type":"ContainerStarted","Data":"50984a8834ab7ccaf5f7fdef17bd3665b8caa11f99c711cc128908027d5701cf"} Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.879954 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-k7m7b"] Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.890663 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-n9mn4" event={"ID":"38f42eb2-0cf6-4c5b-8159-a89e22404a73","Type":"ContainerStarted","Data":"4a5f6ba71522cfabb4fb51d028f0cabf962291348f692768c1bd325145a41716"} Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.913769 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tg9t2" event={"ID":"af1a70ec-af98-44c7-a59c-2d5a6d5c1200","Type":"ContainerStarted","Data":"7293a729e127263564cebcf2e5b1f2e57d788cedbc385992d41bb0ffef590d08"} Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.923890 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-skpjj" event={"ID":"f17fa271-26a6-4620-930b-f30d50f3412b","Type":"ContainerStarted","Data":"21ea7a58de729fedb8dbb1621411fd255a2e08fdac1d8f8a15fe1dabb4791f7b"} Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.927651 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:47 crc kubenswrapper[4685]: E0321 03:49:47.928720 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:48.428702069 +0000 UTC m=+220.905770861 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.932218 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fb258" event={"ID":"3a049943-f882-47d7-ac4e-703eebca8103","Type":"ContainerStarted","Data":"8da1aec1a8348adfa66812c8d9eab5485e41ee2b9455ad6832706f2ab66b6a59"} Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.933721 4685 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-mkjd2 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.933764 4685 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-mkjd2" podUID="cd1c8c06-710c-401b-803e-9cc18aa1b4b6" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.939855 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m9xxt"] Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.953616 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plggv" Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.980828 4685 ???:1] "http: TLS handshake error from 192.168.126.11:35820: no serving certificate available for the kubelet" Mar 21 03:49:47 crc kubenswrapper[4685]: I0321 03:49:47.984258 4685 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.030167 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:48 crc kubenswrapper[4685]: E0321 03:49:48.051512 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:48.55149144 +0000 UTC m=+221.028560232 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.071471 4685 ???:1] "http: TLS handshake error from 192.168.126.11:35834: no serving certificate available for the kubelet" Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.073452 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dk5lq"] Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.089653 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vktqs"] Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.099098 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pcwhp"] Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.109306 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5bwhq"] Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.133737 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:48 crc kubenswrapper[4685]: E0321 03:49:48.134303 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:48.634286789 +0000 UTC m=+221.111355581 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.134436 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:48 crc kubenswrapper[4685]: E0321 03:49:48.135041 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:48.635020751 +0000 UTC m=+221.112089543 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.176386 4685 ???:1] "http: TLS handshake error from 192.168.126.11:35838: no serving certificate available for the kubelet" Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.236364 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:48 crc kubenswrapper[4685]: E0321 03:49:48.236558 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:48.736511457 +0000 UTC m=+221.213580259 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.239908 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:48 crc kubenswrapper[4685]: E0321 03:49:48.240452 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:48.740437536 +0000 UTC m=+221.217506328 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:48 crc kubenswrapper[4685]: W0321 03:49:48.250049 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0b74168_914c_4a2e_9122_c55d3bc3bcc2.slice/crio-3e485c3e61eb11f68474a81f868186ce97eaccf2ea2e9b951dc92852fd082d7a WatchSource:0}: Error finding container 3e485c3e61eb11f68474a81f868186ce97eaccf2ea2e9b951dc92852fd082d7a: Status 404 returned error can't find the container with id 3e485c3e61eb11f68474a81f868186ce97eaccf2ea2e9b951dc92852fd082d7a Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.253114 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h" podStartSLOduration=178.253095009 podStartE2EDuration="2m58.253095009s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:48.252753789 +0000 UTC m=+220.729822581" watchObservedRunningTime="2026-03-21 03:49:48.253095009 +0000 UTC m=+220.730163801" Mar 21 03:49:48 crc kubenswrapper[4685]: W0321 03:49:48.275632 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-93d966695f32f22c9fd4b3ac90bde132b0deb3910512df0b1ec954e23d4a877d WatchSource:0}: Error finding container 93d966695f32f22c9fd4b3ac90bde132b0deb3910512df0b1ec954e23d4a877d: Status 404 returned error can't find the container with id 93d966695f32f22c9fd4b3ac90bde132b0deb3910512df0b1ec954e23d4a877d Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.294096 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n6cmp" podStartSLOduration=178.294074481 podStartE2EDuration="2m58.294074481s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:48.291919436 +0000 UTC m=+220.768988228" watchObservedRunningTime="2026-03-21 03:49:48.294074481 +0000 UTC m=+220.771143263" Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.299976 4685 patch_prober.go:28] interesting pod/router-default-5444994796-vds97 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.300026 4685 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vds97" podUID="b13d8427-dcef-4925-92b6-0e6bf1aca8c8" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.342235 4685 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:48 crc kubenswrapper[4685]: E0321 03:49:48.342519 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:48.842503029 +0000 UTC m=+221.319571821 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.351123 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h" Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.353219 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-v9rdl"] Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.374955 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-flxks" podStartSLOduration=178.374938472 podStartE2EDuration="2m58.374938472s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:48.372714924 +0000 UTC m=+220.849783716" watchObservedRunningTime="2026-03-21 03:49:48.374938472 +0000 UTC m=+220.852007264" Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.445089 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:48 crc kubenswrapper[4685]: E0321 03:49:48.446570 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:48.946559312 +0000 UTC m=+221.423628104 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:48 crc kubenswrapper[4685]: W0321 03:49:48.517407 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfda9b1ff_e4a8_4d15_8f7b_2974991cd252.slice/crio-fcc48248acb261059e42eccc92fbfe1e6d22d9877d63d36e42875b29fd5f4da2 WatchSource:0}: Error finding container fcc48248acb261059e42eccc92fbfe1e6d22d9877d63d36e42875b29fd5f4da2: Status 404 returned error can't find the container with id fcc48248acb261059e42eccc92fbfe1e6d22d9877d63d36e42875b29fd5f4da2 Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.546374 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:48 crc kubenswrapper[4685]: E0321 03:49:48.546512 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:49.046491861 +0000 UTC m=+221.523560643 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.546563 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:48 crc kubenswrapper[4685]: E0321 03:49:48.546883 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:49.046873352 +0000 UTC m=+221.523942144 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.648264 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:48 crc kubenswrapper[4685]: E0321 03:49:48.648594 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:49.148580685 +0000 UTC m=+221.625649477 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.666214 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2ckzq" podStartSLOduration=178.666195478 podStartE2EDuration="2m58.666195478s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:48.664878978 +0000 UTC m=+221.141947780" watchObservedRunningTime="2026-03-21 03:49:48.666195478 +0000 UTC m=+221.143264270" Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.750823 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:48 crc kubenswrapper[4685]: E0321 03:49:48.751125 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:49.251112881 +0000 UTC m=+221.728181673 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:48 crc kubenswrapper[4685]: W0321 03:49:48.831002 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-aa785eefba01cd12d27a932568e876e4637e25511cd061b51898c05913c67203 WatchSource:0}: Error finding container aa785eefba01cd12d27a932568e876e4637e25511cd061b51898c05913c67203: Status 404 returned error can't find the container with id aa785eefba01cd12d27a932568e876e4637e25511cd061b51898c05913c67203 Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.844610 4685 ???:1] "http: TLS handshake error from 192.168.126.11:35854: no serving certificate available for the kubelet" Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.851291 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:48 crc kubenswrapper[4685]: W0321 03:49:48.851384 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-d4a1d1ff3eb605f0532d1504373425fc167baf6eaf094be1a29b17111fd238c3 WatchSource:0}: Error finding container d4a1d1ff3eb605f0532d1504373425fc167baf6eaf094be1a29b17111fd238c3: Status 404 returned error can't find the container with id d4a1d1ff3eb605f0532d1504373425fc167baf6eaf094be1a29b17111fd238c3 Mar 21 03:49:48 crc kubenswrapper[4685]: E0321 03:49:48.851436 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:49.351417111 +0000 UTC m=+221.828485903 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.851498 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:48 crc kubenswrapper[4685]: E0321 03:49:48.851828 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:49.351818213 +0000 UTC m=+221.828887005 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.944784 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-skpjj" event={"ID":"f17fa271-26a6-4620-930b-f30d50f3412b","Type":"ContainerStarted","Data":"b25aa2e5b69194b77482887a5944bb168440e4289af4718b2425616c1dbb7f32"} Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.948357 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pcwhp" event={"ID":"b0b74168-914c-4a2e-9122-c55d3bc3bcc2","Type":"ContainerStarted","Data":"3e485c3e61eb11f68474a81f868186ce97eaccf2ea2e9b951dc92852fd082d7a"} Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.949191 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m9xxt" event={"ID":"b7a27ba4-0f5f-4ad6-9883-8698fd160802","Type":"ContainerStarted","Data":"d7740a8a594a899ac136a80a1afb1f4acebf6a6e122cedf9736005abdd5475be"} Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.952464 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" event={"ID":"fda9b1ff-e4a8-4d15-8f7b-2974991cd252","Type":"ContainerStarted","Data":"fcc48248acb261059e42eccc92fbfe1e6d22d9877d63d36e42875b29fd5f4da2"} Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.952969 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:48 crc kubenswrapper[4685]: E0321 03:49:48.953315 4685 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:49.453294418 +0000 UTC m=+221.930363210 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.954162 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567745-czvxj" event={"ID":"0306bf0e-f0c1-4e47-b63e-909b979c5844","Type":"ContainerStarted","Data":"d892bfba09ee863f547cc620c295ae4e415a18c4f06c302e67c4123329179220"} Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.958629 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6tr2m" event={"ID":"7c21a705-7e47-4418-803a-41a459acef90","Type":"ContainerStarted","Data":"9ce02a3e9e557081d65f2e53fe560e11921a663ac4ada807597931312f689364"} Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.962808 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-k7m7b" event={"ID":"119e0da8-6d9c-48cc-ab21-85a78ca95c5c","Type":"ContainerStarted","Data":"7fb5b2392b88b6d105fc70d45b6cf4b7288bc9bb29e481355eabab555fe3bc2a"} Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.966351 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-97584" event={"ID":"9e7908ff-df99-4827-8a79-ce0b7f0f5d80","Type":"ContainerStarted","Data":"9c7812b127b45f9aee70a4a432c088d34b8da2f38f69586ca75f81ec6eb51838"} Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.969530 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"93d966695f32f22c9fd4b3ac90bde132b0deb3910512df0b1ec954e23d4a877d"} Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.971017 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d4a1d1ff3eb605f0532d1504373425fc167baf6eaf094be1a29b17111fd238c3"} Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.977580 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dk5lq" event={"ID":"236033b2-63e5-43e2-a9b7-3549f0802c30","Type":"ContainerStarted","Data":"5a6c115c466f8c58deb5ea0607b5fcffb490da2e61c2fa39d20b3ae91bed0dbb"} Mar 21 03:49:48 crc kubenswrapper[4685]: I0321 03:49:48.997309 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l8n7r" event={"ID":"5a3e73bc-7747-47d5-958b-1adc974f20a9","Type":"ContainerStarted","Data":"06297d294b018e2ebd88de3767343446a96bc536d5581f608e5f9b16b4a9bb02"} Mar 21 
03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.009642 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567748-zv7h8" event={"ID":"4ede3f08-f29b-4cb9-a96f-1c66239498f6","Type":"ContainerStarted","Data":"2733a1272a0f7b10b0af660e7184383bdad8b10a00fc70bf8280bbe070f5e258"} Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.010831 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-clz2m" event={"ID":"eb165aaf-36a6-4965-bebf-6a40e1695b94","Type":"ContainerStarted","Data":"c52fadbcdb29a76226d7a77a2018a2dd759b05b9a6978a5e6350437bd12b6fc8"} Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.011640 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-clz2m" Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.015375 4685 patch_prober.go:28] interesting pod/downloads-7954f5f757-clz2m container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.015433 4685 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-clz2m" podUID="eb165aaf-36a6-4965-bebf-6a40e1695b94" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.016849 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-52mpk" event={"ID":"d841e9a0-5181-40b8-9374-daa38341c4ff","Type":"ContainerStarted","Data":"f6d6bfe9bdcf441c872dacc85484662191f05e77174bf1ac5215ef4e5eb63ba9"} Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.017504 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5bwhq" event={"ID":"081c0a2d-7b34-487c-b965-e29e9231ef7b","Type":"ContainerStarted","Data":"5dc5ef1b76e21328d3f35a9e09359dec68281a1ab3714106ec020e142330000b"} Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.019950 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-n9mn4" event={"ID":"38f42eb2-0cf6-4c5b-8159-a89e22404a73","Type":"ContainerStarted","Data":"122f026383b1f9fa560e70029894a493d8fb5cd90a9727e9ebd60ec71773483f"} Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.023072 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fb258" event={"ID":"3a049943-f882-47d7-ac4e-703eebca8103","Type":"ContainerStarted","Data":"daef2c7d6e63c025f2219931fa96d773899d62fcef9858e8a7de2d805e672b8f"} Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.023118 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fb258" Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.024009 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vktqs" event={"ID":"4cc9f216-4352-4bff-a0eb-48659e3a603d","Type":"ContainerStarted","Data":"105d94fd6326915ee9836581054696ae42903685ec3fe272a4533bd3c7be4d43"} Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.026085 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jl6z" event={"ID":"ce5ae64a-c99b-4700-aef6-23a77794a308","Type":"ContainerStarted","Data":"e8f7cf795d28b8b734d08d471650d080f71bd5b1a3e12731b91f80f09e75881f"} Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.028547 4685 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-fb258 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.028595 4685 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fb258" podUID="3a049943-f882-47d7-ac4e-703eebca8103" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.029179 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hncrq" event={"ID":"6a66c652-318d-4fc7-9bff-a73d65b12966","Type":"ContainerStarted","Data":"2484f2929750599264f9e244415e3c332d4b9e6f6a6941f54c962c6f7b7c925a"} Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.032532 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-cd44l" event={"ID":"bb0bd79d-5459-4b35-adb8-eca9d0fb069c","Type":"ContainerStarted","Data":"3b7b670aa8833601bf8185c8f1e0ae40860f3b82d8c9c8059a5001d22b16a5a2"} Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.034138 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-646sv" event={"ID":"3f819915-64a0-4327-ac01-5ff842cbc592","Type":"ContainerStarted","Data":"322b4c9e5055fb4d56c39d9cfd1200c15cd152269a95f9d9f2d4dc611851f682"} Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.046546 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-qsvzc" event={"ID":"b74d4447-5cdf-482d-bc97-81cc3f6f5f1c","Type":"ContainerStarted","Data":"01d48a02034f3bfedfb7c09a71848eefbdd402d7b1d8dc425521bf0008a092ec"} Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.049077 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mlpft" event={"ID":"ea531307-d9b1-4798-bac8-d34094d27e2c","Type":"ContainerStarted","Data":"fe44f1040d49528b4151f46145fe4764346e7a66203686cab7acf3b3f7dbc526"} Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.050680 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-j8qrk" podStartSLOduration=179.050669929 podStartE2EDuration="2m59.050669929s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:49.049797263 +0000 UTC m=+221.526866075" watchObservedRunningTime="2026-03-21 03:49:49.050669929 +0000 UTC m=+221.527738721" Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.054159 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:49 crc kubenswrapper[4685]: E0321 03:49:49.055140 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:49.555125304 +0000 UTC m=+222.032194096 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.062662 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-qflxg" event={"ID":"e3945afb-20d9-4312-b9bb-dbe5bc788cda","Type":"ContainerStarted","Data":"55c6a6cf449e6788df541a37e5e3c6dfec238eae0d0ae6ee86a2592bc273458e"} Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.064463 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"aa785eefba01cd12d27a932568e876e4637e25511cd061b51898c05913c67203"} Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.067488 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n8ttn" event={"ID":"c2d03fe9-f23e-4f3d-a281-b80ba7fa7f38","Type":"ContainerStarted","Data":"39e3c23a06f0e68d98ed0d9cc5928141acae043ca8aeefb24f6a25a591100a7d"} Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.068175 4685 patch_prober.go:28] interesting pod/console-operator-58897d9998-djlmv container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.068233 4685 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-djlmv" podUID="7b429d7c-3700-4d9f-b6b2-554219223515" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.098593 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-djlmv" podStartSLOduration=179.098577591 podStartE2EDuration="2m59.098577591s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:49.096379304 +0000 UTC m=+221.573448096" watchObservedRunningTime="2026-03-21 03:49:49.098577591 +0000 UTC m=+221.575646383" Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.158941 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:49 crc kubenswrapper[4685]: E0321 03:49:49.159161 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:49.659142406 +0000 UTC m=+222.136211198 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.159406 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:49 crc kubenswrapper[4685]: E0321 03:49:49.163367 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:49.663348534 +0000 UTC m=+222.140417516 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.261643 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:49 crc kubenswrapper[4685]: E0321 03:49:49.261974 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:49.761958302 +0000 UTC m=+222.239027094 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.331293 4685 patch_prober.go:28] interesting pod/router-default-5444994796-vds97 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 03:49:49 crc kubenswrapper[4685]: [-]has-synced failed: reason withheld Mar 21 03:49:49 crc kubenswrapper[4685]: [+]process-running ok Mar 21 03:49:49 crc kubenswrapper[4685]: healthz check failed Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.331601 4685 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vds97" podUID="b13d8427-dcef-4925-92b6-0e6bf1aca8c8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.352913 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-cd44l" podStartSLOduration=179.352899048 podStartE2EDuration="2m59.352899048s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:49.352154156 +0000 UTC m=+221.829222958" watchObservedRunningTime="2026-03-21 03:49:49.352899048 +0000 UTC m=+221.829967840" Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.362942 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:49 crc kubenswrapper[4685]: E0321 03:49:49.363266 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:49.863252542 +0000 UTC m=+222.340321344 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.390607 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-qsvzc" podStartSLOduration=6.3905869 podStartE2EDuration="6.3905869s" podCreationTimestamp="2026-03-21 03:49:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:49.384345471 +0000 UTC m=+221.861414263" watchObservedRunningTime="2026-03-21 03:49:49.3905869 +0000 UTC m=+221.867655702" Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.450661 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fb258" podStartSLOduration=179.45064552 podStartE2EDuration="2m59.45064552s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:49.450160556 +0000 UTC m=+221.927229348" watchObservedRunningTime="2026-03-21 03:49:49.45064552 +0000 UTC m=+221.927714312" Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.452154 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6tr2m" podStartSLOduration=179.452145886 podStartE2EDuration="2m59.452145886s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:49.415206736 +0000 UTC m=+221.892275538" watchObservedRunningTime="2026-03-21 03:49:49.452145886 +0000 UTC m=+221.929214678" Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.464441 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:49 crc kubenswrapper[4685]: E0321 03:49:49.464769 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:49.964755128 +0000 UTC m=+222.441823920 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.566670 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:49 crc kubenswrapper[4685]: E0321 03:49:49.566959 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:50.066948925 +0000 UTC m=+222.544017707 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.670202 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:49 crc kubenswrapper[4685]: E0321 03:49:49.670558 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:50.170542734 +0000 UTC m=+222.647611526 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.771405 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:49 crc kubenswrapper[4685]: E0321 03:49:49.771817 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:50.271800833 +0000 UTC m=+222.748869625 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.873001 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:49 crc kubenswrapper[4685]: E0321 03:49:49.873186 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:50.373159125 +0000 UTC m=+222.850227917 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.873373 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:49 crc kubenswrapper[4685]: E0321 03:49:49.873727 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:50.373713962 +0000 UTC m=+222.850782754 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.974616 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:49 crc kubenswrapper[4685]: E0321 03:49:49.974829 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:50.474800035 +0000 UTC m=+222.951868827 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:49 crc kubenswrapper[4685]: I0321 03:49:49.974888 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:49 crc kubenswrapper[4685]: E0321 03:49:49.975247 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:50.475236198 +0000 UTC m=+222.952305000 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.075089 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mlpft" event={"ID":"ea531307-d9b1-4798-bac8-d34094d27e2c","Type":"ContainerStarted","Data":"90437988724cfb543253d770e0690a45a69ec2f714b2486f8f31d799aee2ba9b"} Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.076634 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mlpft" Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.077050 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:50 crc kubenswrapper[4685]: E0321 03:49:50.077279 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:50.57725992 +0000 UTC m=+223.054328712 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.078707 4685 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-mlpft container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.078780 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-k7m7b" event={"ID":"119e0da8-6d9c-48cc-ab21-85a78ca95c5c","Type":"ContainerStarted","Data":"c4a9ab867422bd7bf809f9e45574461e09ecc925f9dcaf82a659ea72b0667fac"} Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.078797 4685 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mlpft" podUID="ea531307-d9b1-4798-bac8-d34094d27e2c" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.084498 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pcwhp" event={"ID":"b0b74168-914c-4a2e-9122-c55d3bc3bcc2","Type":"ContainerStarted","Data":"8769ba1d49069d8d855eda602952eeb9e340b678969874cb89b2d49ff075670b"} Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.084933 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-pcwhp" Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.088638 4685 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-pcwhp container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body= Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.088700 4685 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-pcwhp" podUID="b0b74168-914c-4a2e-9122-c55d3bc3bcc2" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.095512 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mlpft" podStartSLOduration=180.095494973 podStartE2EDuration="3m0.095494973s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:50.094666488 +0000 UTC m=+222.571735290" watchObservedRunningTime="2026-03-21 03:49:50.095494973 +0000 UTC m=+222.572563775" Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.096046 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/downloads-7954f5f757-clz2m" podStartSLOduration=180.096039809 podStartE2EDuration="3m0.096039809s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:49.509122172 +0000 UTC m=+221.986190974" watchObservedRunningTime="2026-03-21 03:49:50.096039809 +0000 UTC m=+222.573108601" Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.098203 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dk5lq" event={"ID":"236033b2-63e5-43e2-a9b7-3549f0802c30","Type":"ContainerStarted","Data":"2f8b5055baef9c0a661e363751842a37a965e7dfd68b740433a59aa4b111bc3d"} Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.100756 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5bwhq" event={"ID":"081c0a2d-7b34-487c-b965-e29e9231ef7b","Type":"ContainerStarted","Data":"f29b30d46dc6e0ae72f2b93017f4eafd29ed2efcd04af63fd6ff8e783670a756"} Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.105371 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2ptjk" event={"ID":"8f518bb5-0336-44df-ac60-174c8426974e","Type":"ContainerStarted","Data":"75b2cdf1e17799f5d0e0de2650ebac728a2c50656f4b338c49555454a64957b2"} Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.105526 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2ptjk" Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.107264 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tg9t2" event={"ID":"af1a70ec-af98-44c7-a59c-2d5a6d5c1200","Type":"ContainerStarted","Data":"6725041c9672d2d1afed413c38eb78d78acd6b38868d4d3c4fd02e49c8578c3a"} Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.109165 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-p5r46" event={"ID":"dd62f2fc-10ef-4a6e-80f1-0813f3a681bd","Type":"ContainerStarted","Data":"d7d7d4fc47503c3a42c8a9686d3fd0f568fc27c803c76641fde728c9f3fbb579"} Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.110596 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jl6z" event={"ID":"ce5ae64a-c99b-4700-aef6-23a77794a308","Type":"ContainerStarted","Data":"917a154eaea8d2ad633d7d6d18360d1a564711729927105636a9420d8231f036"} Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.112037 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-qflxg" event={"ID":"e3945afb-20d9-4312-b9bb-dbe5bc788cda","Type":"ContainerStarted","Data":"1e8ce774c83e739a1735b02d878beea44f8b8260b9006d703e0baf8a422a66ec"} Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.118985 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-k7m7b" podStartSLOduration=7.118965774 podStartE2EDuration="7.118965774s" podCreationTimestamp="2026-03-21 03:49:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-21 03:49:50.113454777 +0000 UTC m=+222.590523589" watchObservedRunningTime="2026-03-21 03:49:50.118965774 +0000 UTC m=+222.596034566" Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.128235 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-864h4" event={"ID":"9c1c82f3-080b-47ea-93df-596d79aa2bf8","Type":"ContainerStarted","Data":"966fedcbfa06a0ffedccd76dfebc2cb45a4837c43bb0589a36f1a9352e7dc7a4"} Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.131812 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n8ttn" event={"ID":"c2d03fe9-f23e-4f3d-a281-b80ba7fa7f38","Type":"ContainerStarted","Data":"ab25b759b85cbfa430f9cd333f3cca78ddd51377c1696479e55e331bb3946917"} Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.139737 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m9xxt" event={"ID":"b7a27ba4-0f5f-4ad6-9883-8698fd160802","Type":"ContainerStarted","Data":"d74eb7f30a9619cb8af7dab6aa3a2aa867f452c049b5b2fc9c5f2043d52fea3d"} Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.146262 4685 generic.go:334] "Generic (PLEG): container finished" podID="ca5cb48c-43cc-428f-bf99-ba396d595e5c" containerID="4d867597d3cd9bf0799a21cf3948a8a6f0e5269872faa0aae9bce98529ee8758" exitCode=0 Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.146351 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl" event={"ID":"ca5cb48c-43cc-428f-bf99-ba396d595e5c","Type":"ContainerDied","Data":"4d867597d3cd9bf0799a21cf3948a8a6f0e5269872faa0aae9bce98529ee8758"} Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.157725 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"b071bf17697fdbea557db3f868ad6bdc7f4fa00390a026bf109de9a64b41c0e4"} Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.157867 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.164598 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l8n7r" event={"ID":"5a3e73bc-7747-47d5-958b-1adc974f20a9","Type":"ContainerStarted","Data":"e4613a973f0dded4c8fbdbd86fa0b9e18d4d863df2d284614ba6f3ae1bd256bb"} Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.168115 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vktqs" event={"ID":"4cc9f216-4352-4bff-a0eb-48659e3a603d","Type":"ContainerStarted","Data":"d0844e6b0824bfd0f42678336b19c39511172769982a903d3c72f505894aef7b"} Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.172273 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2tvkz" event={"ID":"844b8e4f-7ba3-4a52-b6a7-0e8ea0d78003","Type":"ContainerStarted","Data":"99734a3a384a93c13e78eee2247b720c42790f49ff81d43e0f4a6482220f0d00"} Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.176687 4685 ???:1] "http: TLS handshake error from 192.168.126.11:35858: no serving certificate available for the kubelet" Mar 21 03:49:50 crc 
kubenswrapper[4685]: I0321 03:49:50.180278 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-pcwhp" podStartSLOduration=180.180258961 podStartE2EDuration="3m0.180258961s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:50.144701794 +0000 UTC m=+222.621770586" watchObservedRunningTime="2026-03-21 03:49:50.180258961 +0000 UTC m=+222.657327763" Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.180645 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:50 crc kubenswrapper[4685]: E0321 03:49:50.183870 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:50.68385081 +0000 UTC m=+223.160919802 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.186518 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hncrq" event={"ID":"6a66c652-318d-4fc7-9bff-a73d65b12966","Type":"ContainerStarted","Data":"985d2795150c7b17cd94badd027a081b7e4a6587d4038f77348b1221badc8eff"} Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.206349 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567745-czvxj" event={"ID":"0306bf0e-f0c1-4e47-b63e-909b979c5844","Type":"ContainerStarted","Data":"616371f1f698c4c3153138ce8dc8751f4b5600bcbbaed1da8fcabab8ea80aa1d"} Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.231077 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jjffn" event={"ID":"2efbba77-f5bd-48cb-a790-f4c3564acb75","Type":"ContainerStarted","Data":"d86b2f058fe978366ef98007a16a804bba66a96b08f6c182d7a89d872bb87457"} Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.238016 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c9c77d7d76433efca51baea1140aa34bc65a29d71e4c9e2f7c36f8853aaf7b2d"} Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.238356 4685 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-fb258 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Mar 21 
03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.238407 4685 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fb258" podUID="3a049943-f882-47d7-ac4e-703eebca8103" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.238975 4685 patch_prober.go:28] interesting pod/console-operator-58897d9998-djlmv container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.239010 4685 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-djlmv" podUID="7b429d7c-3700-4d9f-b6b2-554219223515" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.242982 4685 patch_prober.go:28] interesting pod/downloads-7954f5f757-clz2m container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.243030 4685 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-clz2m" podUID="eb165aaf-36a6-4965-bebf-6a40e1695b94" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.264093 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-p5r46" podStartSLOduration=180.264075522 podStartE2EDuration="3m0.264075522s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:50.262098052 +0000 UTC m=+222.739166844" watchObservedRunningTime="2026-03-21 03:49:50.264075522 +0000 UTC m=+222.741144314" Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.264356 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dk5lq" podStartSLOduration=180.26435007 podStartE2EDuration="3m0.26435007s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:50.184085757 +0000 UTC m=+222.661154559" watchObservedRunningTime="2026-03-21 03:49:50.26435007 +0000 UTC m=+222.741418862" Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.282344 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:50 crc kubenswrapper[4685]: E0321 
03:49:50.283619 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:50.783602773 +0000 UTC m=+223.260671565 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.313448 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-864h4" Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.313954 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-864h4" Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.315984 4685 patch_prober.go:28] interesting pod/router-default-5444994796-vds97 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 03:49:50 crc kubenswrapper[4685]: [-]has-synced failed: reason withheld Mar 21 03:49:50 crc kubenswrapper[4685]: [+]process-running ok Mar 21 03:49:50 crc kubenswrapper[4685]: healthz check failed Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.316040 4685 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vds97" podUID="b13d8427-dcef-4925-92b6-0e6bf1aca8c8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.320993 4685 patch_prober.go:28] interesting pod/apiserver-76f77b778f-864h4 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.5:8443/livez\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.321047 4685 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-864h4" podUID="9c1c82f3-080b-47ea-93df-596d79aa2bf8" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.5:8443/livez\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.384140 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:50 crc kubenswrapper[4685]: E0321 03:49:50.387613 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:50.887592855 +0000 UTC m=+223.364661847 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.388820 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-864h4" podStartSLOduration=180.388807612 podStartE2EDuration="3m0.388807612s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:50.383312115 +0000 UTC m=+222.860380907" watchObservedRunningTime="2026-03-21 03:49:50.388807612 +0000 UTC m=+222.865876404" Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.390775 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tg9t2" podStartSLOduration=180.390767671 podStartE2EDuration="3m0.390767671s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:50.305728114 +0000 UTC m=+222.782796926" watchObservedRunningTime="2026-03-21 03:49:50.390767671 +0000 UTC m=+222.867836463" Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.407232 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2ptjk" podStartSLOduration=180.407211759 podStartE2EDuration="3m0.407211759s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:50.404665982 +0000 UTC m=+222.881734794" watchObservedRunningTime="2026-03-21 03:49:50.407211759 +0000 UTC m=+222.884280561" Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.445185 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m9xxt" podStartSLOduration=180.44516748 podStartE2EDuration="3m0.44516748s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:50.442099717 +0000 UTC m=+222.919168509" watchObservedRunningTime="2026-03-21 03:49:50.44516748 +0000 UTC m=+222.922236272" Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.458698 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-qflxg" podStartSLOduration=180.458679369 podStartE2EDuration="3m0.458679369s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:50.457202074 +0000 UTC m=+222.934270866" watchObservedRunningTime="2026-03-21 03:49:50.458679369 +0000 UTC m=+222.935748161" Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.472384 4685 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29567745-czvxj" podStartSLOduration=180.472366894 podStartE2EDuration="3m0.472366894s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:50.470659132 +0000 UTC m=+222.947727954" watchObservedRunningTime="2026-03-21 03:49:50.472366894 +0000 UTC m=+222.949435686" Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.485992 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:50 crc kubenswrapper[4685]: E0321 03:49:50.486130 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:50.98610833 +0000 UTC m=+223.463177112 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.486211 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:50 crc kubenswrapper[4685]: E0321 03:49:50.486497 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:50.986489452 +0000 UTC m=+223.463558244 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.501314 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vktqs" podStartSLOduration=180.501301441 podStartE2EDuration="3m0.501301441s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:50.500483576 +0000 UTC m=+222.977552378" watchObservedRunningTime="2026-03-21 03:49:50.501301441 +0000 UTC m=+222.978370233" Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.578700 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jjffn" podStartSLOduration=180.578683666 podStartE2EDuration="3m0.578683666s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:50.57817086 +0000 UTC m=+223.055239652" watchObservedRunningTime="2026-03-21 03:49:50.578683666 +0000 UTC m=+223.055752448" Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.586937 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:50 crc kubenswrapper[4685]: E0321 03:49:50.587324 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:51.087308917 +0000 UTC m=+223.564377709 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.660711 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-646sv" podStartSLOduration=180.660691601 podStartE2EDuration="3m0.660691601s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:50.658417312 +0000 UTC m=+223.135486104" watchObservedRunningTime="2026-03-21 03:49:50.660691601 +0000 UTC m=+223.137760393" Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.688950 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:50 crc kubenswrapper[4685]: E0321 03:49:50.689305 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:51.189292908 +0000 UTC m=+223.666361700 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.716573 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-skpjj" podStartSLOduration=180.716538843 podStartE2EDuration="3m0.716538843s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:50.68343549 +0000 UTC m=+223.160504282" watchObservedRunningTime="2026-03-21 03:49:50.716538843 +0000 UTC m=+223.193607635" Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.716743 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-97584" podStartSLOduration=180.716735429 podStartE2EDuration="3m0.716735429s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:50.713739129 +0000 UTC m=+223.190807921" watchObservedRunningTime="2026-03-21 03:49:50.716735429 +0000 UTC m=+223.193804241" Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.790459 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:50 crc kubenswrapper[4685]: E0321 03:49:50.790680 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:51.29065242 +0000 UTC m=+223.767721212 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.790754 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:50 crc kubenswrapper[4685]: E0321 03:49:50.791082 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:51.291069022 +0000 UTC m=+223.768137814 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.892404 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:50 crc kubenswrapper[4685]: E0321 03:49:50.892762 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:51.392736793 +0000 UTC m=+223.869805585 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:50 crc kubenswrapper[4685]: I0321 03:49:50.993771 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:50 crc kubenswrapper[4685]: E0321 03:49:50.994057 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:51.494046223 +0000 UTC m=+223.971115015 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.094627 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:51 crc kubenswrapper[4685]: E0321 03:49:51.095017 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:51.595001253 +0000 UTC m=+224.072070045 (durationBeforeRetry 500ms). 
Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.195967 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8"
Mar 21 03:49:51 crc kubenswrapper[4685]: E0321 03:49:51.196310 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:51.696296073 +0000 UTC m=+224.173364855 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.254612 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-52mpk" event={"ID":"d841e9a0-5181-40b8-9374-daa38341c4ff","Type":"ContainerStarted","Data":"1efec1438689621ba08365e333c817becfa221e9519d350d1c6cf0f05ddda7ed"}
Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.263062 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jl6z" event={"ID":"ce5ae64a-c99b-4700-aef6-23a77794a308","Type":"ContainerStarted","Data":"372b7b3a08c28311d633c9d591330485aca8437317e95a70fe9e27a25557fe20"}
Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.265335 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hncrq" event={"ID":"6a66c652-318d-4fc7-9bff-a73d65b12966","Type":"ContainerStarted","Data":"703a6013b9e4c48a0e489620ac392ea620aff0141b5b712954f7f730358665d5"}
Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.265501 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-hncrq"
Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.266731 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" event={"ID":"fda9b1ff-e4a8-4d15-8f7b-2974991cd252","Type":"ContainerStarted","Data":"c86092808cb98b37b639c2c44863a9402599548881f1d536bff3dabd8cb81be4"}
Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.268804 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2tvkz" event={"ID":"844b8e4f-7ba3-4a52-b6a7-0e8ea0d78003","Type":"ContainerStarted","Data":"aa3b8380db1624a99d61a6c04df8bddbbdaabbe306c791d151bccdda0864757e"}
Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.277580 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"e6f3f0499db92b04adaf8561acc64db6c03adad6a4bc96ac042af4fd332d3600"}
Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.290676 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5bwhq" event={"ID":"081c0a2d-7b34-487c-b965-e29e9231ef7b","Type":"ContainerStarted","Data":"7094a1f13d5ab15979a1c7ede3b3d236440bf4b726fb7a97a5cb05ebfa9064b4"}
Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.296625 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 03:49:51 crc kubenswrapper[4685]: E0321 03:49:51.297067 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:51.797048946 +0000 UTC m=+224.274117738 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.307686 4685 patch_prober.go:28] interesting pod/router-default-5444994796-vds97 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 21 03:49:51 crc kubenswrapper[4685]: [-]has-synced failed: reason withheld
Mar 21 03:49:51 crc kubenswrapper[4685]: [+]process-running ok
Mar 21 03:49:51 crc kubenswrapper[4685]: healthz check failed
Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.307740 4685 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vds97" podUID="b13d8427-dcef-4925-92b6-0e6bf1aca8c8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.312817 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-52mpk" podStartSLOduration=181.312796153 podStartE2EDuration="3m1.312796153s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:51.27737253 +0000 UTC m=+223.754441322" watchObservedRunningTime="2026-03-21 03:49:51.312796153 +0000 UTC m=+223.789864945"
Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.315265 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l8n7r" event={"ID":"5a3e73bc-7747-47d5-958b-1adc974f20a9","Type":"ContainerStarted","Data":"a120138146311bb0fbff1915bb15752bf50932e712eb5f559056ee5eefe2da7c"}
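
The router-default startup probe above fails with HTTP 500, and the start-of-body dump says why: the [-]backend-http and [-]has-synced checks are failing while [+]process-running is ok, i.e. the router process is up but has not yet synced routes or a healthy backend. A sketch of an aggregate healthz handler in this style, using only the standard library; the check names and output format are copied from the log, but the handler itself is illustrative, not the router's actual code:

    package main

    import (
        "fmt"
        "net/http"
    )

    // Each named check mirrors one line of the probe's start-of-body output.
    type check struct {
        name string
        ok   func() bool
    }

    var checks = []check{
        {"backend-http", func() bool { return false }},   // placeholder: no healthy backend yet
        {"has-synced", func() bool { return false }},     // placeholder: initial route sync pending
        {"process-running", func() bool { return true }}, // the process itself is up
    }

    func healthz(w http.ResponseWriter, r *http.Request) {
        body, healthy := "", true
        for _, c := range checks {
            if c.ok() {
                body += fmt.Sprintf("[+]%s ok\n", c.name)
            } else {
                body += fmt.Sprintf("[-]%s failed: reason withheld\n", c.name)
                healthy = false
            }
        }
        if !healthy {
            w.WriteHeader(http.StatusInternalServerError) // the "statuscode: 500" the kubelet logs
            body += "healthz check failed\n"
        }
        fmt.Fprint(w, body)
    }

    func main() {
        http.HandleFunc("/healthz", healthz)
        http.ListenAndServe(":8080", nil)
    }
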
event={"ID":"5a3e73bc-7747-47d5-958b-1adc974f20a9","Type":"ContainerStarted","Data":"a120138146311bb0fbff1915bb15752bf50932e712eb5f559056ee5eefe2da7c"} Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.325643 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n8ttn" event={"ID":"c2d03fe9-f23e-4f3d-a281-b80ba7fa7f38","Type":"ContainerStarted","Data":"1c1b5a60030d487def08448b703117af0a0ff199c17eb80f123d975dda86dcb5"} Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.325799 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n8ttn" Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.341334 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl" event={"ID":"ca5cb48c-43cc-428f-bf99-ba396d595e5c","Type":"ContainerStarted","Data":"ea408dd4224b3139cb906ede7c544d620866855309662580c769de10ad0d58de"} Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.369789 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-hncrq" podStartSLOduration=8.36977227 podStartE2EDuration="8.36977227s" podCreationTimestamp="2026-03-21 03:49:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:51.367546252 +0000 UTC m=+223.844615064" watchObservedRunningTime="2026-03-21 03:49:51.36977227 +0000 UTC m=+223.846841062" Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.398480 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:51 crc kubenswrapper[4685]: E0321 03:49:51.399609 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:51.899592894 +0000 UTC m=+224.376661686 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.428274 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n8ttn" podStartSLOduration=181.428255882 podStartE2EDuration="3m1.428255882s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:51.397398967 +0000 UTC m=+223.874467759" watchObservedRunningTime="2026-03-21 03:49:51.428255882 +0000 UTC m=+223.905324674" Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.431006 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l8n7r" podStartSLOduration=181.430986265 podStartE2EDuration="3m1.430986265s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:51.427797888 +0000 UTC m=+223.904866690" watchObservedRunningTime="2026-03-21 03:49:51.430986265 +0000 UTC m=+223.908055057" Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.434466 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-n9mn4" event={"ID":"38f42eb2-0cf6-4c5b-8159-a89e22404a73","Type":"ContainerStarted","Data":"f9ca16becdbe0bccce53f548e43f70df30a7fe70c21570c603a767567f8b30a3"} Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.449895 4685 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-mlpft container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.449953 4685 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mlpft" podUID="ea531307-d9b1-4798-bac8-d34094d27e2c" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.453279 4685 patch_prober.go:28] interesting pod/downloads-7954f5f757-clz2m container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.453358 4685 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-clz2m" podUID="eb165aaf-36a6-4965-bebf-6a40e1695b94" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.453453 4685 patch_prober.go:28] 
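
The pod_startup_latency_tracker entries are plain arithmetic: podStartE2EDuration is observedRunningTime minus podCreationTimestamp (for dns-operator, 03:49:51 minus 03:46:50, i.e. 181.312796153s = 3m1.312796153s), and the m=+223... suffixes are seconds on the kubelet's monotonic clock since process start. The zero-valued firstStartedPulling/lastFinishedPulling ("0001-01-01 00:00:00") mean no image pull was observed, which is also why podStartSLOduration equals the E2E duration here. A quick check of the subtraction:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        created, _ := time.Parse(time.RFC3339, "2026-03-21T03:46:50Z")
        running, _ := time.Parse(time.RFC3339Nano, "2026-03-21T03:49:51.312796153Z")
        fmt.Println(running.Sub(created)) // prints 3m1.312796153s, the logged podStartE2EDuration
    }
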
Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.453482 4685 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-pcwhp" podUID="b0b74168-914c-4a2e-9122-c55d3bc3bcc2" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused"
Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.498103 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-n9mn4" podStartSLOduration=181.498079788 podStartE2EDuration="3m1.498079788s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:51.495017845 +0000 UTC m=+223.972086667" watchObservedRunningTime="2026-03-21 03:49:51.498079788 +0000 UTC m=+223.975148580"
Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.499234 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 03:49:51 crc kubenswrapper[4685]: E0321 03:49:51.503655 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:52.003633597 +0000 UTC m=+224.480702379 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.601765 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8"
Mar 21 03:49:51 crc kubenswrapper[4685]: E0321 03:49:51.602403 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:52.102390379 +0000 UTC m=+224.579459171 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.702621 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 03:49:51 crc kubenswrapper[4685]: E0321 03:49:51.702793 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:52.202759551 +0000 UTC m=+224.679828343 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.702903 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8"
Mar 21 03:49:51 crc kubenswrapper[4685]: E0321 03:49:51.703234 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:52.203225425 +0000 UTC m=+224.680294217 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
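
Note the two distinct probe-failure signatures in this stretch of log: the router returns an HTTP 500 from a live endpoint, while olm-operator, download-server and marketplace-operator fail with "connect: connection refused", meaning nothing is accepting TCP connections on the probed port yet. The difference shows up with a plain dial, sketched here against the olm-operator endpoint from the log; the address is only reachable from inside the cluster SDN, so treat this as illustrative:

    package main

    import (
        "errors"
        "fmt"
        "net"
        "syscall"
        "time"
    )

    func main() {
        // 10.217.0.20:8443 is the olm-operator readiness endpoint from the log above.
        conn, err := net.DialTimeout("tcp", "10.217.0.20:8443", 2*time.Second)
        switch {
        case errors.Is(err, syscall.ECONNREFUSED):
            fmt.Println("connect: connection refused - nothing is listening on the port yet")
        case err != nil:
            fmt.Println("dial failed:", err)
        default:
            conn.Close()
            fmt.Println("TCP connect ok - an HTTP probe would now judge the status code")
        }
    }
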
Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.804005 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 03:49:51 crc kubenswrapper[4685]: E0321 03:49:51.804212 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:52.304186965 +0000 UTC m=+224.781255757 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.804436 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8"
Mar 21 03:49:51 crc kubenswrapper[4685]: E0321 03:49:51.804758 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:52.304751242 +0000 UTC m=+224.781820034 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.905357 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 03:49:51 crc kubenswrapper[4685]: E0321 03:49:51.905541 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:52.405507915 +0000 UTC m=+224.882576707 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:51 crc kubenswrapper[4685]: I0321 03:49:51.905975 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8"
Mar 21 03:49:51 crc kubenswrapper[4685]: E0321 03:49:51.906385 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:52.406371002 +0000 UTC m=+224.883439794 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.006703 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 03:49:52 crc kubenswrapper[4685]: E0321 03:49:52.007342 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:52.507146276 +0000 UTC m=+224.984215098 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.108286 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8"
Mar 21 03:49:52 crc kubenswrapper[4685]: E0321 03:49:52.108595 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:52.60858041 +0000 UTC m=+225.085649202 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.165586 4685 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-2ptjk container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body=
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.165639 4685 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2ptjk" podUID="8f518bb5-0336-44df-ac60-174c8426974e" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused"
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.165825 4685 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-2ptjk container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body=
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.165893 4685 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2ptjk" podUID="8f518bb5-0336-44df-ac60-174c8426974e" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused"
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.208828 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 03:49:52 crc kubenswrapper[4685]: E0321 03:49:52.208983 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:52.708954151 +0000 UTC m=+225.186022943 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.209041 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8"
Mar 21 03:49:52 crc kubenswrapper[4685]: E0321 03:49:52.209331 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:52.709318402 +0000 UTC m=+225.186387194 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.304004 4685 patch_prober.go:28] interesting pod/router-default-5444994796-vds97 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 21 03:49:52 crc kubenswrapper[4685]: [-]has-synced failed: reason withheld
Mar 21 03:49:52 crc kubenswrapper[4685]: [+]process-running ok
Mar 21 03:49:52 crc kubenswrapper[4685]: healthz check failed
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.304073 4685 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vds97" podUID="b13d8427-dcef-4925-92b6-0e6bf1aca8c8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.311413 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 03:49:52 crc kubenswrapper[4685]: E0321 03:49:52.311569 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:52.81154942 +0000 UTC m=+225.288618212 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.311809 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8"
Mar 21 03:49:52 crc kubenswrapper[4685]: E0321 03:49:52.312095 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:52.812088006 +0000 UTC m=+225.289156798 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.413308 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 03:49:52 crc kubenswrapper[4685]: E0321 03:49:52.413443 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:52.913423717 +0000 UTC m=+225.390492509 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.413510 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8"
Mar 21 03:49:52 crc kubenswrapper[4685]: E0321 03:49:52.413799 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:52.913788468 +0000 UTC m=+225.390857250 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.439937 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2pgbl" event={"ID":"b695978e-b67c-4812-9083-22538cdd3045","Type":"ContainerStarted","Data":"cba715c1dcff29640b7b9404f9c50c3915eb002b02aee3985372f8071ada7a5f"}
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.448054 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-v9rdl" event={"ID":"fda9b1ff-e4a8-4d15-8f7b-2974991cd252","Type":"ContainerStarted","Data":"03449ffe015b7181f98b8ad81780433e6807afa33bb31687cf7506147918db6b"}
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.452073 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mlpft"
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.509636 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl" podStartSLOduration=182.509616942 podStartE2EDuration="3m2.509616942s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:52.50956074 +0000 UTC m=+224.986629532" watchObservedRunningTime="2026-03-21 03:49:52.509616942 +0000 UTC m=+224.986685734"
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.510848 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2tvkz" podStartSLOduration=182.510828439 podStartE2EDuration="3m2.510828439s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:52.472409774 +0000 UTC m=+224.949478566" watchObservedRunningTime="2026-03-21 03:49:52.510828439 +0000 UTC m=+224.987897231"
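
The ContainerStarted event for hostpath-provisioner/csi-hostpathplugin-2pgbl above is the piece the failing volume operations have been waiting for: once this plugin's registrar announces kubevirt.io.hostpath-provisioner over the kubelet's plugin-registration socket, the "not found in the list of registered CSI drivers" retries should begin to succeed. On the node itself, registration can be spot-checked by listing the conventional registry directory; the path is the usual kubelet default and an assumption here:

    package main

    import (
        "fmt"
        "log"
        "os"
    )

    func main() {
        // Conventional kubelet plugin-registration directory (an assumption; deployments may differ).
        // Registrars create a <driver-name>-reg.sock endpoint here once their driver is live.
        entries, err := os.ReadDir("/var/lib/kubelet/plugins_registry")
        if err != nil {
            log.Fatal(err)
        }
        for _, e := range entries {
            fmt.Println(e.Name()) // expect kubevirt.io.hostpath-provisioner-reg.sock to appear
        }
    }
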
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.514185 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 03:49:52 crc kubenswrapper[4685]: E0321 03:49:52.514382 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:53.014352986 +0000 UTC m=+225.491421778 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.514438 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8"
Mar 21 03:49:52 crc kubenswrapper[4685]: E0321 03:49:52.516397 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:53.016384087 +0000 UTC m=+225.493452879 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.532490 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jl6z" podStartSLOduration=182.532471905 podStartE2EDuration="3m2.532471905s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:52.529807444 +0000 UTC m=+225.006876236" watchObservedRunningTime="2026-03-21 03:49:52.532471905 +0000 UTC m=+225.009540697"
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.548746 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-v9rdl" podStartSLOduration=182.548729847 podStartE2EDuration="3m2.548729847s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:52.547130699 +0000 UTC m=+225.024199491" watchObservedRunningTime="2026-03-21 03:49:52.548729847 +0000 UTC m=+225.025798639"
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.566372 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5bwhq" podStartSLOduration=182.566356122 podStartE2EDuration="3m2.566356122s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:52.5656463 +0000 UTC m=+225.042715092" watchObservedRunningTime="2026-03-21 03:49:52.566356122 +0000 UTC m=+225.043424914"
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.616501 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 03:49:52 crc kubenswrapper[4685]: E0321 03:49:52.616663 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:53.116641495 +0000 UTC m=+225.593710287 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.616868 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8"
Mar 21 03:49:52 crc kubenswrapper[4685]: E0321 03:49:52.617207 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:53.117196742 +0000 UTC m=+225.594265534 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.717637 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 03:49:52 crc kubenswrapper[4685]: E0321 03:49:52.717807 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:53.217781691 +0000 UTC m=+225.694850483 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.717880 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8"
Mar 21 03:49:52 crc kubenswrapper[4685]: E0321 03:49:52.718189 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:53.218180373 +0000 UTC m=+225.695249165 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.776945 4685 ???:1] "http: TLS handshake error from 192.168.126.11:35866: no serving certificate available for the kubelet"
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.819408 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 03:49:52 crc kubenswrapper[4685]: E0321 03:49:52.819612 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:53.319582466 +0000 UTC m=+225.796651268 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
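
The "no serving certificate available for the kubelet" handshake error above means the kubelet has not yet been issued its serving certificate, so TLS connections to its server port from 192.168.126.11 fail until the kubelet's serving CertificateSigningRequest is approved and signed (the pending-CSR explanation is an inference from the message, not stated in the log). A client-go sketch for inspecting CSRs, with the kubeconfig path assumed as before:

    package main

    import (
        "context"
        "fmt"
        "log"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        config, err := clientcmd.BuildConfigFromFlags("", "/root/.kube/config") // assumed path
        if err != nil {
            log.Fatal(err)
        }
        clientset, err := kubernetes.NewForConfig(config)
        if err != nil {
            log.Fatal(err)
        }
        csrs, err := clientset.CertificatesV1().CertificateSigningRequests().List(context.TODO(), metav1.ListOptions{})
        if err != nil {
            log.Fatal(err)
        }
        for _, csr := range csrs.Items {
            // A kubelet serving CSR with no Approved condition would explain the handshake errors.
            fmt.Printf("%s signer=%s conditions=%v\n", csr.Name, csr.Spec.SignerName, csr.Status.Conditions)
        }
    }
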
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.819665 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8"
Mar 21 03:49:52 crc kubenswrapper[4685]: E0321 03:49:52.819945 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:53.319931326 +0000 UTC m=+225.797000118 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.905906 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mkjd2"]
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.906179 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-mkjd2" podUID="cd1c8c06-710c-401b-803e-9cc18aa1b4b6" containerName="controller-manager" containerID="cri-o://74e90300e77c4a10664ef3e64ba0c75e3b8d29360bcbddcadac288be6716b6ba" gracePeriod=30
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.926049 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 03:49:52 crc kubenswrapper[4685]: E0321 03:49:52.926572 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:53.426552157 +0000 UTC m=+225.903620949 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.927471 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-mkjd2"
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.971757 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-plggv"]
Mar 21 03:49:52 crc kubenswrapper[4685]: I0321 03:49:52.971969 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plggv" podUID="588eb87c-d2c0-45fb-a0f7-33de36d5d745" containerName="route-controller-manager" containerID="cri-o://1c65935116820cb8f53969211726b48113307e8d74c8fca574cb4bca5111d41d" gracePeriod=30
Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.029431 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8"
Mar 21 03:49:53 crc kubenswrapper[4685]: E0321 03:49:53.029808 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:53.529796156 +0000 UTC m=+226.006864948 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.131018 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 03:49:53 crc kubenswrapper[4685]: E0321 03:49:53.131218 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:53.631188849 +0000 UTC m=+226.108257641 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
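
The SyncLoop DELETE entries above show the kubelet reacting to API-side deletions of the controller-manager and route-controller-manager pods: it asks CRI-O to stop each container with the pod's 30-second grace period (SIGTERM first, SIGKILL only at the deadline). The same grace period can be set explicitly on a client-initiated delete; a sketch against the pod named in the log, illustrative only:

    package main

    import (
        "context"
        "log"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        config, err := clientcmd.BuildConfigFromFlags("", "/root/.kube/config") // assumed path
        if err != nil {
            log.Fatal(err)
        }
        clientset, err := kubernetes.NewForConfig(config)
        if err != nil {
            log.Fatal(err)
        }
        grace := int64(30) // same 30s gracePeriod the kubelet logs for these containers
        err = clientset.CoreV1().Pods("openshift-controller-manager").Delete(
            context.TODO(), "controller-manager-879f6c89f-mkjd2",
            metav1.DeleteOptions{GracePeriodSeconds: &grace})
        if err != nil {
            log.Fatal(err)
        }
    }
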
Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.131312 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8"
Mar 21 03:49:53 crc kubenswrapper[4685]: E0321 03:49:53.131813 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:53.631803728 +0000 UTC m=+226.108872520 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.232430 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 03:49:53 crc kubenswrapper[4685]: E0321 03:49:53.232613 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:53.732584362 +0000 UTC m=+226.209653154 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.232726 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8"
Mar 21 03:49:53 crc kubenswrapper[4685]: E0321 03:49:53.233115 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:53.733108358 +0000 UTC m=+226.210177150 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.305230 4685 patch_prober.go:28] interesting pod/router-default-5444994796-vds97 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 21 03:49:53 crc kubenswrapper[4685]: [-]has-synced failed: reason withheld
Mar 21 03:49:53 crc kubenswrapper[4685]: [+]process-running ok
Mar 21 03:49:53 crc kubenswrapper[4685]: healthz check failed
Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.305584 4685 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vds97" podUID="b13d8427-dcef-4925-92b6-0e6bf1aca8c8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.333728 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 03:49:53 crc kubenswrapper[4685]: E0321 03:49:53.333922 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:53.833897102 +0000 UTC m=+226.310965894 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.334023 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8"
Mar 21 03:49:53 crc kubenswrapper[4685]: E0321 03:49:53.334330 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:53.834317125 +0000 UTC m=+226.311385917 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.434947 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 03:49:53 crc kubenswrapper[4685]: E0321 03:49:53.435091 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:53.935074808 +0000 UTC m=+226.412143600 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.435140 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:53 crc kubenswrapper[4685]: E0321 03:49:53.435393 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:53.935384838 +0000 UTC m=+226.412453630 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.454905 4685 generic.go:334] "Generic (PLEG): container finished" podID="588eb87c-d2c0-45fb-a0f7-33de36d5d745" containerID="1c65935116820cb8f53969211726b48113307e8d74c8fca574cb4bca5111d41d" exitCode=0 Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.454963 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plggv" event={"ID":"588eb87c-d2c0-45fb-a0f7-33de36d5d745","Type":"ContainerDied","Data":"1c65935116820cb8f53969211726b48113307e8d74c8fca574cb4bca5111d41d"} Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.464170 4685 generic.go:334] "Generic (PLEG): container finished" podID="cd1c8c06-710c-401b-803e-9cc18aa1b4b6" containerID="74e90300e77c4a10664ef3e64ba0c75e3b8d29360bcbddcadac288be6716b6ba" exitCode=0 Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.464254 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mkjd2" event={"ID":"cd1c8c06-710c-401b-803e-9cc18aa1b4b6","Type":"ContainerDied","Data":"74e90300e77c4a10664ef3e64ba0c75e3b8d29360bcbddcadac288be6716b6ba"} Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.536529 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:53 crc kubenswrapper[4685]: E0321 03:49:53.536722 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:54.036695698 +0000 UTC m=+226.513764490 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.537029 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:53 crc kubenswrapper[4685]: E0321 03:49:53.537372 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:54.037360908 +0000 UTC m=+226.514429900 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.638099 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:53 crc kubenswrapper[4685]: E0321 03:49:53.638374 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:54.138338588 +0000 UTC m=+226.615407380 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.638448 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:53 crc kubenswrapper[4685]: E0321 03:49:53.638750 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:54.13873747 +0000 UTC m=+226.615806262 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.723276 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mhc9s"] Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.724275 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mhc9s" Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.726643 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.739289 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:53 crc kubenswrapper[4685]: E0321 03:49:53.739520 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:54.239504724 +0000 UTC m=+226.716573516 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.742081 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mhc9s"] Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.840980 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtx65\" (UniqueName: \"kubernetes.io/projected/931ed0e7-7ffb-48ba-92b0-28883a6f0b39-kube-api-access-jtx65\") pod \"community-operators-mhc9s\" (UID: \"931ed0e7-7ffb-48ba-92b0-28883a6f0b39\") " pod="openshift-marketplace/community-operators-mhc9s" Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.841016 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/931ed0e7-7ffb-48ba-92b0-28883a6f0b39-catalog-content\") pod \"community-operators-mhc9s\" (UID: \"931ed0e7-7ffb-48ba-92b0-28883a6f0b39\") " pod="openshift-marketplace/community-operators-mhc9s" Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.841070 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.841221 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/931ed0e7-7ffb-48ba-92b0-28883a6f0b39-utilities\") pod \"community-operators-mhc9s\" (UID: \"931ed0e7-7ffb-48ba-92b0-28883a6f0b39\") " pod="openshift-marketplace/community-operators-mhc9s" Mar 21 03:49:53 crc kubenswrapper[4685]: E0321 03:49:53.841318 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:54.341306259 +0000 UTC m=+226.818375051 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.920825 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xvjs7"] Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.922045 4685 util.go:30] "No sandbox for pod can be found. 
Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.926229 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.939116 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xvjs7"]
Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.943794 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 03:49:53 crc kubenswrapper[4685]: E0321 03:49:53.943974 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:54.44394662 +0000 UTC m=+226.921015432 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.944056 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8"
Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.944108 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/931ed0e7-7ffb-48ba-92b0-28883a6f0b39-utilities\") pod \"community-operators-mhc9s\" (UID: \"931ed0e7-7ffb-48ba-92b0-28883a6f0b39\") " pod="openshift-marketplace/community-operators-mhc9s"
Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.944151 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtx65\" (UniqueName: \"kubernetes.io/projected/931ed0e7-7ffb-48ba-92b0-28883a6f0b39-kube-api-access-jtx65\") pod \"community-operators-mhc9s\" (UID: \"931ed0e7-7ffb-48ba-92b0-28883a6f0b39\") " pod="openshift-marketplace/community-operators-mhc9s"
Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.944168 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/931ed0e7-7ffb-48ba-92b0-28883a6f0b39-catalog-content\") pod \"community-operators-mhc9s\" (UID: \"931ed0e7-7ffb-48ba-92b0-28883a6f0b39\") " pod="openshift-marketplace/community-operators-mhc9s"
Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.944553 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/931ed0e7-7ffb-48ba-92b0-28883a6f0b39-catalog-content\") pod \"community-operators-mhc9s\" (UID: \"931ed0e7-7ffb-48ba-92b0-28883a6f0b39\") " pod="openshift-marketplace/community-operators-mhc9s"
Mar 21 03:49:53 crc kubenswrapper[4685]: E0321 03:49:53.944804 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:54.444793915 +0000 UTC m=+226.921862707 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.945155 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/931ed0e7-7ffb-48ba-92b0-28883a6f0b39-utilities\") pod \"community-operators-mhc9s\" (UID: \"931ed0e7-7ffb-48ba-92b0-28883a6f0b39\") " pod="openshift-marketplace/community-operators-mhc9s"
Mar 21 03:49:53 crc kubenswrapper[4685]: I0321 03:49:53.980978 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtx65\" (UniqueName: \"kubernetes.io/projected/931ed0e7-7ffb-48ba-92b0-28883a6f0b39-kube-api-access-jtx65\") pod \"community-operators-mhc9s\" (UID: \"931ed0e7-7ffb-48ba-92b0-28883a6f0b39\") " pod="openshift-marketplace/community-operators-mhc9s"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.040167 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mhc9s"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.045338 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 03:49:54 crc kubenswrapper[4685]: E0321 03:49:54.045483 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:54.545458626 +0000 UTC m=+227.022527418 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.045650 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd-catalog-content\") pod \"certified-operators-xvjs7\" (UID: \"53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd\") " pod="openshift-marketplace/certified-operators-xvjs7"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.045695 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd-utilities\") pod \"certified-operators-xvjs7\" (UID: \"53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd\") " pod="openshift-marketplace/certified-operators-xvjs7"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.045785 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.045847 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27mn8\" (UniqueName: \"kubernetes.io/projected/53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd-kube-api-access-27mn8\") pod \"certified-operators-xvjs7\" (UID: \"53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd\") " pod="openshift-marketplace/certified-operators-xvjs7"
Mar 21 03:49:54 crc kubenswrapper[4685]: E0321 03:49:54.046221 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:54.546202119 +0000 UTC m=+227.023271121 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.130308 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7swf2"]
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.133586 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7swf2"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.148367 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.148627 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27mn8\" (UniqueName: \"kubernetes.io/projected/53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd-kube-api-access-27mn8\") pod \"certified-operators-xvjs7\" (UID: \"53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd\") " pod="openshift-marketplace/certified-operators-xvjs7"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.148693 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd-catalog-content\") pod \"certified-operators-xvjs7\" (UID: \"53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd\") " pod="openshift-marketplace/certified-operators-xvjs7"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.148729 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd-utilities\") pod \"certified-operators-xvjs7\" (UID: \"53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd\") " pod="openshift-marketplace/certified-operators-xvjs7"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.149237 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd-utilities\") pod \"certified-operators-xvjs7\" (UID: \"53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd\") " pod="openshift-marketplace/certified-operators-xvjs7"
Mar 21 03:49:54 crc kubenswrapper[4685]: E0321 03:49:54.149327 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:54.649308853 +0000 UTC m=+227.126377645 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.149867 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd-catalog-content\") pod \"certified-operators-xvjs7\" (UID: \"53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd\") " pod="openshift-marketplace/certified-operators-xvjs7"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.152926 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7swf2"]
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.176361 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27mn8\" (UniqueName: \"kubernetes.io/projected/53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd-kube-api-access-27mn8\") pod \"certified-operators-xvjs7\" (UID: \"53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd\") " pod="openshift-marketplace/certified-operators-xvjs7"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.237293 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xvjs7"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.251755 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4ebdf18-8426-42cc-93a6-60b46261aebe-catalog-content\") pod \"community-operators-7swf2\" (UID: \"d4ebdf18-8426-42cc-93a6-60b46261aebe\") " pod="openshift-marketplace/community-operators-7swf2"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.251846 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zktx\" (UniqueName: \"kubernetes.io/projected/d4ebdf18-8426-42cc-93a6-60b46261aebe-kube-api-access-5zktx\") pod \"community-operators-7swf2\" (UID: \"d4ebdf18-8426-42cc-93a6-60b46261aebe\") " pod="openshift-marketplace/community-operators-7swf2"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.251875 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4ebdf18-8426-42cc-93a6-60b46261aebe-utilities\") pod \"community-operators-7swf2\" (UID: \"d4ebdf18-8426-42cc-93a6-60b46261aebe\") " pod="openshift-marketplace/community-operators-7swf2"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.251920 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8"
Mar 21 03:49:54 crc kubenswrapper[4685]: E0321 03:49:54.252175 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:54.75216388 +0000 UTC m=+227.229232672 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.304311 4685 patch_prober.go:28] interesting pod/router-default-5444994796-vds97 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 21 03:49:54 crc kubenswrapper[4685]: [-]has-synced failed: reason withheld
Mar 21 03:49:54 crc kubenswrapper[4685]: [+]process-running ok
Mar 21 03:49:54 crc kubenswrapper[4685]: healthz check failed
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.304578 4685 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vds97" podUID="b13d8427-dcef-4925-92b6-0e6bf1aca8c8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.335703 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2dcgr"]
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.338745 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2dcgr"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.347464 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2dcgr"]
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.353559 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.353773 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4ebdf18-8426-42cc-93a6-60b46261aebe-utilities\") pod \"community-operators-7swf2\" (UID: \"d4ebdf18-8426-42cc-93a6-60b46261aebe\") " pod="openshift-marketplace/community-operators-7swf2"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.353821 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4ebdf18-8426-42cc-93a6-60b46261aebe-catalog-content\") pod \"community-operators-7swf2\" (UID: \"d4ebdf18-8426-42cc-93a6-60b46261aebe\") " pod="openshift-marketplace/community-operators-7swf2"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.353886 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zktx\" (UniqueName: \"kubernetes.io/projected/d4ebdf18-8426-42cc-93a6-60b46261aebe-kube-api-access-5zktx\") pod \"community-operators-7swf2\" (UID: \"d4ebdf18-8426-42cc-93a6-60b46261aebe\") " pod="openshift-marketplace/community-operators-7swf2"
Mar 21 03:49:54 crc kubenswrapper[4685]: E0321 03:49:54.354205 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:54.854191872 +0000 UTC m=+227.331260664 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.354517 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4ebdf18-8426-42cc-93a6-60b46261aebe-utilities\") pod \"community-operators-7swf2\" (UID: \"d4ebdf18-8426-42cc-93a6-60b46261aebe\") " pod="openshift-marketplace/community-operators-7swf2"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.354712 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4ebdf18-8426-42cc-93a6-60b46261aebe-catalog-content\") pod \"community-operators-7swf2\" (UID: \"d4ebdf18-8426-42cc-93a6-60b46261aebe\") " pod="openshift-marketplace/community-operators-7swf2"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.366983 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plggv"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.388894 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zktx\" (UniqueName: \"kubernetes.io/projected/d4ebdf18-8426-42cc-93a6-60b46261aebe-kube-api-access-5zktx\") pod \"community-operators-7swf2\" (UID: \"d4ebdf18-8426-42cc-93a6-60b46261aebe\") " pod="openshift-marketplace/community-operators-7swf2"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.417704 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mkjd2"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.419589 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c5b579c85-zbdrh"]
Mar 21 03:49:54 crc kubenswrapper[4685]: E0321 03:49:54.419796 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd1c8c06-710c-401b-803e-9cc18aa1b4b6" containerName="controller-manager"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.419807 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd1c8c06-710c-401b-803e-9cc18aa1b4b6" containerName="controller-manager"
Mar 21 03:49:54 crc kubenswrapper[4685]: E0321 03:49:54.419821 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="588eb87c-d2c0-45fb-a0f7-33de36d5d745" containerName="route-controller-manager"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.419829 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="588eb87c-d2c0-45fb-a0f7-33de36d5d745" containerName="route-controller-manager"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.428174 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="588eb87c-d2c0-45fb-a0f7-33de36d5d745" containerName="route-controller-manager"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.428201 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd1c8c06-710c-401b-803e-9cc18aa1b4b6" containerName="controller-manager"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.428614 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c5b579c85-zbdrh"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.436000 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c5b579c85-zbdrh"]
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.461144 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd1c8c06-710c-401b-803e-9cc18aa1b4b6-serving-cert\") pod \"cd1c8c06-710c-401b-803e-9cc18aa1b4b6\" (UID: \"cd1c8c06-710c-401b-803e-9cc18aa1b4b6\") "
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.461199 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/588eb87c-d2c0-45fb-a0f7-33de36d5d745-serving-cert\") pod \"588eb87c-d2c0-45fb-a0f7-33de36d5d745\" (UID: \"588eb87c-d2c0-45fb-a0f7-33de36d5d745\") "
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.461241 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbxpx\" (UniqueName: \"kubernetes.io/projected/588eb87c-d2c0-45fb-a0f7-33de36d5d745-kube-api-access-hbxpx\") pod \"588eb87c-d2c0-45fb-a0f7-33de36d5d745\" (UID: \"588eb87c-d2c0-45fb-a0f7-33de36d5d745\") "
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.461279 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/588eb87c-d2c0-45fb-a0f7-33de36d5d745-client-ca\") pod \"588eb87c-d2c0-45fb-a0f7-33de36d5d745\" (UID: \"588eb87c-d2c0-45fb-a0f7-33de36d5d745\") "
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.461303 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cd1c8c06-710c-401b-803e-9cc18aa1b4b6-proxy-ca-bundles\") pod \"cd1c8c06-710c-401b-803e-9cc18aa1b4b6\" (UID: \"cd1c8c06-710c-401b-803e-9cc18aa1b4b6\") "
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.461350 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd1c8c06-710c-401b-803e-9cc18aa1b4b6-config\") pod \"cd1c8c06-710c-401b-803e-9cc18aa1b4b6\" (UID: \"cd1c8c06-710c-401b-803e-9cc18aa1b4b6\") "
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.461370 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6b5xg\" (UniqueName: \"kubernetes.io/projected/cd1c8c06-710c-401b-803e-9cc18aa1b4b6-kube-api-access-6b5xg\") pod \"cd1c8c06-710c-401b-803e-9cc18aa1b4b6\" (UID: \"cd1c8c06-710c-401b-803e-9cc18aa1b4b6\") "
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.461389 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/588eb87c-d2c0-45fb-a0f7-33de36d5d745-config\") pod \"588eb87c-d2c0-45fb-a0f7-33de36d5d745\" (UID: \"588eb87c-d2c0-45fb-a0f7-33de36d5d745\") "
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.461406 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd1c8c06-710c-401b-803e-9cc18aa1b4b6-client-ca\") pod \"cd1c8c06-710c-401b-803e-9cc18aa1b4b6\" (UID: \"cd1c8c06-710c-401b-803e-9cc18aa1b4b6\") "
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.461741 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7b276b3-8b85-4cfe-a39c-73da270336e3-catalog-content\") pod \"certified-operators-2dcgr\" (UID: \"f7b276b3-8b85-4cfe-a39c-73da270336e3\") " pod="openshift-marketplace/certified-operators-2dcgr"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.461778 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7b276b3-8b85-4cfe-a39c-73da270336e3-utilities\") pod \"certified-operators-2dcgr\" (UID: \"f7b276b3-8b85-4cfe-a39c-73da270336e3\") " pod="openshift-marketplace/certified-operators-2dcgr"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.461807 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.461873 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dncj\" (UniqueName: \"kubernetes.io/projected/f7b276b3-8b85-4cfe-a39c-73da270336e3-kube-api-access-7dncj\") pod \"certified-operators-2dcgr\" (UID: \"f7b276b3-8b85-4cfe-a39c-73da270336e3\") " pod="openshift-marketplace/certified-operators-2dcgr"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.466777 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/588eb87c-d2c0-45fb-a0f7-33de36d5d745-client-ca" (OuterVolumeSpecName: "client-ca") pod "588eb87c-d2c0-45fb-a0f7-33de36d5d745" (UID: "588eb87c-d2c0-45fb-a0f7-33de36d5d745"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.470224 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/588eb87c-d2c0-45fb-a0f7-33de36d5d745-config" (OuterVolumeSpecName: "config") pod "588eb87c-d2c0-45fb-a0f7-33de36d5d745" (UID: "588eb87c-d2c0-45fb-a0f7-33de36d5d745"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.470288 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd1c8c06-710c-401b-803e-9cc18aa1b4b6-client-ca" (OuterVolumeSpecName: "client-ca") pod "cd1c8c06-710c-401b-803e-9cc18aa1b4b6" (UID: "cd1c8c06-710c-401b-803e-9cc18aa1b4b6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 03:49:54 crc kubenswrapper[4685]: E0321 03:49:54.470666 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:54.970644171 +0000 UTC m=+227.447712963 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.472512 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7swf2"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.475628 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd1c8c06-710c-401b-803e-9cc18aa1b4b6-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "cd1c8c06-710c-401b-803e-9cc18aa1b4b6" (UID: "cd1c8c06-710c-401b-803e-9cc18aa1b4b6"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.475789 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd1c8c06-710c-401b-803e-9cc18aa1b4b6-config" (OuterVolumeSpecName: "config") pod "cd1c8c06-710c-401b-803e-9cc18aa1b4b6" (UID: "cd1c8c06-710c-401b-803e-9cc18aa1b4b6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.478028 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/588eb87c-d2c0-45fb-a0f7-33de36d5d745-kube-api-access-hbxpx" (OuterVolumeSpecName: "kube-api-access-hbxpx") pod "588eb87c-d2c0-45fb-a0f7-33de36d5d745" (UID: "588eb87c-d2c0-45fb-a0f7-33de36d5d745"). InnerVolumeSpecName "kube-api-access-hbxpx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.490227 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/588eb87c-d2c0-45fb-a0f7-33de36d5d745-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "588eb87c-d2c0-45fb-a0f7-33de36d5d745" (UID: "588eb87c-d2c0-45fb-a0f7-33de36d5d745"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.491268 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd1c8c06-710c-401b-803e-9cc18aa1b4b6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cd1c8c06-710c-401b-803e-9cc18aa1b4b6" (UID: "cd1c8c06-710c-401b-803e-9cc18aa1b4b6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.504447 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd1c8c06-710c-401b-803e-9cc18aa1b4b6-kube-api-access-6b5xg" (OuterVolumeSpecName: "kube-api-access-6b5xg") pod "cd1c8c06-710c-401b-803e-9cc18aa1b4b6" (UID: "cd1c8c06-710c-401b-803e-9cc18aa1b4b6"). InnerVolumeSpecName "kube-api-access-6b5xg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.514711 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plggv" event={"ID":"588eb87c-d2c0-45fb-a0f7-33de36d5d745","Type":"ContainerDied","Data":"83be5ab2f9a927228dede71a95aaf214baa9ba4f46e25f24f7cc896ee00a97d6"}
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.514762 4685 scope.go:117] "RemoveContainer" containerID="1c65935116820cb8f53969211726b48113307e8d74c8fca574cb4bca5111d41d"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.515024 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plggv"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.557043 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mkjd2" event={"ID":"cd1c8c06-710c-401b-803e-9cc18aa1b4b6","Type":"ContainerDied","Data":"25366635beebe6b2f8548a1976915b7b51b8c578a70ba4d2a290c186dc0de89c"}
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.557393 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mkjd2"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.562366 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.562590 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7b276b3-8b85-4cfe-a39c-73da270336e3-catalog-content\") pod \"certified-operators-2dcgr\" (UID: \"f7b276b3-8b85-4cfe-a39c-73da270336e3\") " pod="openshift-marketplace/certified-operators-2dcgr"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.562632 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b246afbf-fbd5-4f39-9c95-08aab014dda0-serving-cert\") pod \"route-controller-manager-7c5b579c85-zbdrh\" (UID: \"b246afbf-fbd5-4f39-9c95-08aab014dda0\") " pod="openshift-route-controller-manager/route-controller-manager-7c5b579c85-zbdrh"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.562651 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7b276b3-8b85-4cfe-a39c-73da270336e3-utilities\") pod \"certified-operators-2dcgr\" (UID: \"f7b276b3-8b85-4cfe-a39c-73da270336e3\") " pod="openshift-marketplace/certified-operators-2dcgr"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.562686 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b246afbf-fbd5-4f39-9c95-08aab014dda0-config\") pod \"route-controller-manager-7c5b579c85-zbdrh\" (UID: \"b246afbf-fbd5-4f39-9c95-08aab014dda0\") " pod="openshift-route-controller-manager/route-controller-manager-7c5b579c85-zbdrh"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.562703 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b246afbf-fbd5-4f39-9c95-08aab014dda0-client-ca\") pod \"route-controller-manager-7c5b579c85-zbdrh\" (UID: \"b246afbf-fbd5-4f39-9c95-08aab014dda0\") " pod="openshift-route-controller-manager/route-controller-manager-7c5b579c85-zbdrh"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.562737 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dncj\" (UniqueName: \"kubernetes.io/projected/f7b276b3-8b85-4cfe-a39c-73da270336e3-kube-api-access-7dncj\") pod \"certified-operators-2dcgr\" (UID: \"f7b276b3-8b85-4cfe-a39c-73da270336e3\") " pod="openshift-marketplace/certified-operators-2dcgr"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.562788 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np9mh\" (UniqueName: \"kubernetes.io/projected/b246afbf-fbd5-4f39-9c95-08aab014dda0-kube-api-access-np9mh\") pod \"route-controller-manager-7c5b579c85-zbdrh\" (UID: \"b246afbf-fbd5-4f39-9c95-08aab014dda0\") " pod="openshift-route-controller-manager/route-controller-manager-7c5b579c85-zbdrh"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.562823 4685 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd1c8c06-710c-401b-803e-9cc18aa1b4b6-config\") on node \"crc\" DevicePath \"\""
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.562849 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6b5xg\" (UniqueName: \"kubernetes.io/projected/cd1c8c06-710c-401b-803e-9cc18aa1b4b6-kube-api-access-6b5xg\") on node \"crc\" DevicePath \"\""
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.562858 4685 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd1c8c06-710c-401b-803e-9cc18aa1b4b6-client-ca\") on node \"crc\" DevicePath \"\""
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.562866 4685 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/588eb87c-d2c0-45fb-a0f7-33de36d5d745-config\") on node \"crc\" DevicePath \"\""
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.562874 4685 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd1c8c06-710c-401b-803e-9cc18aa1b4b6-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.562882 4685 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/588eb87c-d2c0-45fb-a0f7-33de36d5d745-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.562890 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbxpx\" (UniqueName: \"kubernetes.io/projected/588eb87c-d2c0-45fb-a0f7-33de36d5d745-kube-api-access-hbxpx\") on node \"crc\" DevicePath \"\""
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.562898 4685 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/588eb87c-d2c0-45fb-a0f7-33de36d5d745-client-ca\") on node \"crc\" DevicePath \"\""
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.562906 4685 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cd1c8c06-710c-401b-803e-9cc18aa1b4b6-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 21 03:49:54 crc kubenswrapper[4685]: E0321 03:49:54.562966 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:55.062952449 +0000 UTC m=+227.540021241 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.563297 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7b276b3-8b85-4cfe-a39c-73da270336e3-catalog-content\") pod \"certified-operators-2dcgr\" (UID: \"f7b276b3-8b85-4cfe-a39c-73da270336e3\") " pod="openshift-marketplace/certified-operators-2dcgr"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.565117 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7b276b3-8b85-4cfe-a39c-73da270336e3-utilities\") pod \"certified-operators-2dcgr\" (UID: \"f7b276b3-8b85-4cfe-a39c-73da270336e3\") " pod="openshift-marketplace/certified-operators-2dcgr"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.573810 4685 scope.go:117] "RemoveContainer" containerID="74e90300e77c4a10664ef3e64ba0c75e3b8d29360bcbddcadac288be6716b6ba"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.587658 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-plggv"]
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.600594 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dncj\" (UniqueName: \"kubernetes.io/projected/f7b276b3-8b85-4cfe-a39c-73da270336e3-kube-api-access-7dncj\") pod \"certified-operators-2dcgr\" (UID: \"f7b276b3-8b85-4cfe-a39c-73da270336e3\") " pod="openshift-marketplace/certified-operators-2dcgr"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.611534 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-plggv"]
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.639010 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mhc9s"]
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.666222 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np9mh\" (UniqueName: \"kubernetes.io/projected/b246afbf-fbd5-4f39-9c95-08aab014dda0-kube-api-access-np9mh\") pod \"route-controller-manager-7c5b579c85-zbdrh\" (UID: \"b246afbf-fbd5-4f39-9c95-08aab014dda0\") " pod="openshift-route-controller-manager/route-controller-manager-7c5b579c85-zbdrh"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.666272 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b246afbf-fbd5-4f39-9c95-08aab014dda0-serving-cert\") pod \"route-controller-manager-7c5b579c85-zbdrh\" (UID: \"b246afbf-fbd5-4f39-9c95-08aab014dda0\") " pod="openshift-route-controller-manager/route-controller-manager-7c5b579c85-zbdrh"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.666305 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.666338 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b246afbf-fbd5-4f39-9c95-08aab014dda0-config\") pod \"route-controller-manager-7c5b579c85-zbdrh\" (UID: \"b246afbf-fbd5-4f39-9c95-08aab014dda0\") " pod="openshift-route-controller-manager/route-controller-manager-7c5b579c85-zbdrh"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.666353 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b246afbf-fbd5-4f39-9c95-08aab014dda0-client-ca\") pod \"route-controller-manager-7c5b579c85-zbdrh\" (UID: \"b246afbf-fbd5-4f39-9c95-08aab014dda0\") " pod="openshift-route-controller-manager/route-controller-manager-7c5b579c85-zbdrh"
Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.667387 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b246afbf-fbd5-4f39-9c95-08aab014dda0-client-ca\") pod \"route-controller-manager-7c5b579c85-zbdrh\" (UID: \"b246afbf-fbd5-4f39-9c95-08aab014dda0\") " pod="openshift-route-controller-manager/route-controller-manager-7c5b579c85-zbdrh"
Mar 21 03:49:54 crc kubenswrapper[4685]: E0321 03:49:54.668171 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:55.168155757 +0000 UTC m=+227.645224549 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.669157 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b246afbf-fbd5-4f39-9c95-08aab014dda0-config\") pod \"route-controller-manager-7c5b579c85-zbdrh\" (UID: \"b246afbf-fbd5-4f39-9c95-08aab014dda0\") " pod="openshift-route-controller-manager/route-controller-manager-7c5b579c85-zbdrh" Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.672906 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mkjd2"] Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.676684 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mkjd2"] Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.677330 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b246afbf-fbd5-4f39-9c95-08aab014dda0-serving-cert\") pod \"route-controller-manager-7c5b579c85-zbdrh\" (UID: \"b246afbf-fbd5-4f39-9c95-08aab014dda0\") " pod="openshift-route-controller-manager/route-controller-manager-7c5b579c85-zbdrh" Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.690108 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2dcgr" Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.743474 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np9mh\" (UniqueName: \"kubernetes.io/projected/b246afbf-fbd5-4f39-9c95-08aab014dda0-kube-api-access-np9mh\") pod \"route-controller-manager-7c5b579c85-zbdrh\" (UID: \"b246afbf-fbd5-4f39-9c95-08aab014dda0\") " pod="openshift-route-controller-manager/route-controller-manager-7c5b579c85-zbdrh" Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.769363 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:54 crc kubenswrapper[4685]: E0321 03:49:54.769676 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:55.269658703 +0000 UTC m=+227.746727495 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.772968 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c5b579c85-zbdrh" Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.818633 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xvjs7"] Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.870419 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:54 crc kubenswrapper[4685]: E0321 03:49:54.870770 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:55.370757457 +0000 UTC m=+227.847826239 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:54 crc kubenswrapper[4685]: I0321 03:49:54.984049 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:54 crc kubenswrapper[4685]: E0321 03:49:54.984544 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:55.484529285 +0000 UTC m=+227.961598077 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.042886 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7swf2"] Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.072086 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.072707 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.086529 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:55 crc kubenswrapper[4685]: E0321 03:49:55.086897 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:55.586885677 +0000 UTC m=+228.063954469 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.087146 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.087245 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.087273 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.174278 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.176612 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2ptjk" Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.177238 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.180205 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.181940 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.190250 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.190819 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:55 crc kubenswrapper[4685]: E0321 03:49:55.190968 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:55.69094104 +0000 UTC m=+228.168009832 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.191101 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2718979f-2e99-4a55-a1b3-71c8b79dd1cb-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"2718979f-2e99-4a55-a1b3-71c8b79dd1cb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.191193 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.191221 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2718979f-2e99-4a55-a1b3-71c8b79dd1cb-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"2718979f-2e99-4a55-a1b3-71c8b79dd1cb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 03:49:55 crc kubenswrapper[4685]: E0321 03:49:55.191537 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:55.691524008 +0000 UTC m=+228.168592810 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.249091 4685 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.293408 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:55 crc kubenswrapper[4685]: E0321 03:49:55.293760 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:55.793738765 +0000 UTC m=+228.270807557 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.294046 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/12290feb-3667-4dd9-9351-0865b9b9757e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"12290feb-3667-4dd9-9351-0865b9b9757e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.294083 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.294105 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2718979f-2e99-4a55-a1b3-71c8b79dd1cb-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"2718979f-2e99-4a55-a1b3-71c8b79dd1cb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.294156 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2718979f-2e99-4a55-a1b3-71c8b79dd1cb-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"2718979f-2e99-4a55-a1b3-71c8b79dd1cb\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.294192 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/12290feb-3667-4dd9-9351-0865b9b9757e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"12290feb-3667-4dd9-9351-0865b9b9757e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.294375 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2718979f-2e99-4a55-a1b3-71c8b79dd1cb-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"2718979f-2e99-4a55-a1b3-71c8b79dd1cb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 03:49:55 crc kubenswrapper[4685]: E0321 03:49:55.295268 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:55.795260112 +0000 UTC m=+228.272328904 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.304899 4685 patch_prober.go:28] interesting pod/router-default-5444994796-vds97 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 03:49:55 crc kubenswrapper[4685]: [-]has-synced failed: reason withheld Mar 21 03:49:55 crc kubenswrapper[4685]: [+]process-running ok Mar 21 03:49:55 crc kubenswrapper[4685]: healthz check failed Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.304986 4685 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vds97" podUID="b13d8427-dcef-4925-92b6-0e6bf1aca8c8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.312903 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2718979f-2e99-4a55-a1b3-71c8b79dd1cb-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"2718979f-2e99-4a55-a1b3-71c8b79dd1cb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.321874 4685 patch_prober.go:28] interesting pod/apiserver-76f77b778f-864h4 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 21 03:49:55 crc kubenswrapper[4685]: [+]log ok Mar 21 03:49:55 crc kubenswrapper[4685]: [+]etcd ok Mar 21 03:49:55 crc kubenswrapper[4685]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 21 03:49:55 crc kubenswrapper[4685]: [+]poststarthook/generic-apiserver-start-informers ok Mar 21 03:49:55 crc kubenswrapper[4685]: [+]poststarthook/max-in-flight-filter ok Mar 21 
03:49:55 crc kubenswrapper[4685]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 21 03:49:55 crc kubenswrapper[4685]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 21 03:49:55 crc kubenswrapper[4685]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 21 03:49:55 crc kubenswrapper[4685]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 21 03:49:55 crc kubenswrapper[4685]: [+]poststarthook/project.openshift.io-projectcache ok Mar 21 03:49:55 crc kubenswrapper[4685]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 21 03:49:55 crc kubenswrapper[4685]: [+]poststarthook/openshift.io-startinformers ok Mar 21 03:49:55 crc kubenswrapper[4685]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 21 03:49:55 crc kubenswrapper[4685]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 21 03:49:55 crc kubenswrapper[4685]: livez check failed Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.321926 4685 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-864h4" podUID="9c1c82f3-080b-47ea-93df-596d79aa2bf8" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.395192 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.395761 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/12290feb-3667-4dd9-9351-0865b9b9757e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"12290feb-3667-4dd9-9351-0865b9b9757e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.395812 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/12290feb-3667-4dd9-9351-0865b9b9757e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"12290feb-3667-4dd9-9351-0865b9b9757e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 03:49:55 crc kubenswrapper[4685]: E0321 03:49:55.396317 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 03:49:55.896303624 +0000 UTC m=+228.373372416 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.396376 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/12290feb-3667-4dd9-9351-0865b9b9757e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"12290feb-3667-4dd9-9351-0865b9b9757e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.419541 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/12290feb-3667-4dd9-9351-0865b9b9757e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"12290feb-3667-4dd9-9351-0865b9b9757e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.442232 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c5b579c85-zbdrh"] Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.455327 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2dcgr"] Mar 21 03:49:55 crc kubenswrapper[4685]: W0321 03:49:55.478620 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7b276b3_8b85_4cfe_a39c_73da270336e3.slice/crio-764e4d787c3552c5a132db4bb0c6be5e3a7f4c1ae5ac6d13e8c12b6397479ad1 WatchSource:0}: Error finding container 764e4d787c3552c5a132db4bb0c6be5e3a7f4c1ae5ac6d13e8c12b6397479ad1: Status 404 returned error can't find the container with id 764e4d787c3552c5a132db4bb0c6be5e3a7f4c1ae5ac6d13e8c12b6397479ad1 Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.496794 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:55 crc kubenswrapper[4685]: E0321 03:49:55.497158 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 03:49:55.99714691 +0000 UTC m=+228.474215702 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pjvx8" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.512239 4685 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-21T03:49:55.249116873Z","Handler":null,"Name":""} Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.519139 4685 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.519176 4685 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.534254 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.567995 4685 generic.go:334] "Generic (PLEG): container finished" podID="931ed0e7-7ffb-48ba-92b0-28883a6f0b39" containerID="2d30a009615dd8540d7c08261ee0942dc7adaa78396509206fadc21d0befa89d" exitCode=0 Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.568065 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mhc9s" event={"ID":"931ed0e7-7ffb-48ba-92b0-28883a6f0b39","Type":"ContainerDied","Data":"2d30a009615dd8540d7c08261ee0942dc7adaa78396509206fadc21d0befa89d"} Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.568099 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mhc9s" event={"ID":"931ed0e7-7ffb-48ba-92b0-28883a6f0b39","Type":"ContainerStarted","Data":"1859afe92c3043eba35a35a14ff3ee7043637b608c0e4d98d6e7be5830f58b89"} Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.571750 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c5b579c85-zbdrh" event={"ID":"b246afbf-fbd5-4f39-9c95-08aab014dda0","Type":"ContainerStarted","Data":"fa68fee35f1ebb62703e132435761b3d42b001d026111b18a30f0e2594ad0e05"} Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.575969 4685 generic.go:334] "Generic (PLEG): container finished" podID="d4ebdf18-8426-42cc-93a6-60b46261aebe" containerID="8baa23e2996f7d8a9daba457d2de4f70d70272f6a61170df0cdd73c0fef91103" exitCode=0 Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.576045 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7swf2" event={"ID":"d4ebdf18-8426-42cc-93a6-60b46261aebe","Type":"ContainerDied","Data":"8baa23e2996f7d8a9daba457d2de4f70d70272f6a61170df0cdd73c0fef91103"} Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.576073 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7swf2" 
event={"ID":"d4ebdf18-8426-42cc-93a6-60b46261aebe","Type":"ContainerStarted","Data":"2801d5cc66b2ed27f43d05ec14001e37a33f3305271ce03aba83a57ba3e6fcac"} Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.577442 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dcgr" event={"ID":"f7b276b3-8b85-4cfe-a39c-73da270336e3","Type":"ContainerStarted","Data":"764e4d787c3552c5a132db4bb0c6be5e3a7f4c1ae5ac6d13e8c12b6397479ad1"} Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.582311 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.584712 4685 generic.go:334] "Generic (PLEG): container finished" podID="53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd" containerID="0d275f816389391c63c170323b6c6b30d2491a63ef326c02e04fa5d6eee4b719" exitCode=0 Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.584769 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xvjs7" event={"ID":"53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd","Type":"ContainerDied","Data":"0d275f816389391c63c170323b6c6b30d2491a63ef326c02e04fa5d6eee4b719"} Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.584792 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xvjs7" event={"ID":"53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd","Type":"ContainerStarted","Data":"c712b4fea9a21c3aceff31bbcdd94562437e64de7d6cd12c4e3c1c4fe3c45589"} Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.596686 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2pgbl" event={"ID":"b695978e-b67c-4812-9083-22538cdd3045","Type":"ContainerStarted","Data":"3d420d771a1dbdbb1bcb8e02386250ca8053ffb7a87ffdab47fbe6ad098df70e"} Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.596724 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2pgbl" event={"ID":"b695978e-b67c-4812-9083-22538cdd3045","Type":"ContainerStarted","Data":"651e9773d7a869a9b7de3a993e4114827dde28d6d7f3cbbd41336cf7d8bffa6e"} Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.598210 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.627324 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.700013 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.703897 4685 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.703928 4685 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.772776 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pjvx8\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.827255 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.856124 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 21 03:49:55 crc kubenswrapper[4685]: W0321 03:49:55.877388 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2718979f_2e99_4a55_a1b3_71c8b79dd1cb.slice/crio-a562324d6d8ea9a650e26f76471f13f676a133ec7c2048c4d11a03d7f371c895 WatchSource:0}: Error finding container a562324d6d8ea9a650e26f76471f13f676a133ec7c2048c4d11a03d7f371c895: Status 404 returned error can't find the container with id a562324d6d8ea9a650e26f76471f13f676a133ec7c2048c4d11a03d7f371c895 Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.889261 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 21 03:49:55 crc kubenswrapper[4685]: W0321 03:49:55.900694 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod12290feb_3667_4dd9_9351_0865b9b9757e.slice/crio-05137d5071072a23cdbbf9eb81306a0d3344a3ad55b1f2867d69a49800cdd318 WatchSource:0}: Error finding container 05137d5071072a23cdbbf9eb81306a0d3344a3ad55b1f2867d69a49800cdd318: Status 404 returned error can't find the container with id 05137d5071072a23cdbbf9eb81306a0d3344a3ad55b1f2867d69a49800cdd318 Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.920863 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-s6khh"] Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.922067 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s6khh" Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.926930 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 21 03:49:55 crc kubenswrapper[4685]: I0321 03:49:55.934739 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s6khh"] Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.002769 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c1f4e4f-a993-423e-8922-d8b81967d483-utilities\") pod \"redhat-marketplace-s6khh\" (UID: \"9c1f4e4f-a993-423e-8922-d8b81967d483\") " pod="openshift-marketplace/redhat-marketplace-s6khh" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.002909 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c1f4e4f-a993-423e-8922-d8b81967d483-catalog-content\") pod \"redhat-marketplace-s6khh\" (UID: \"9c1f4e4f-a993-423e-8922-d8b81967d483\") " pod="openshift-marketplace/redhat-marketplace-s6khh" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.002956 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px57g\" (UniqueName: \"kubernetes.io/projected/9c1f4e4f-a993-423e-8922-d8b81967d483-kube-api-access-px57g\") pod \"redhat-marketplace-s6khh\" (UID: \"9c1f4e4f-a993-423e-8922-d8b81967d483\") " pod="openshift-marketplace/redhat-marketplace-s6khh" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.031492 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-pjvx8"] Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.104168 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c1f4e4f-a993-423e-8922-d8b81967d483-catalog-content\") pod \"redhat-marketplace-s6khh\" (UID: \"9c1f4e4f-a993-423e-8922-d8b81967d483\") " pod="openshift-marketplace/redhat-marketplace-s6khh" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.103715 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c1f4e4f-a993-423e-8922-d8b81967d483-catalog-content\") pod \"redhat-marketplace-s6khh\" (UID: \"9c1f4e4f-a993-423e-8922-d8b81967d483\") " pod="openshift-marketplace/redhat-marketplace-s6khh" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.104265 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px57g\" (UniqueName: \"kubernetes.io/projected/9c1f4e4f-a993-423e-8922-d8b81967d483-kube-api-access-px57g\") pod \"redhat-marketplace-s6khh\" (UID: \"9c1f4e4f-a993-423e-8922-d8b81967d483\") " pod="openshift-marketplace/redhat-marketplace-s6khh" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.104315 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c1f4e4f-a993-423e-8922-d8b81967d483-utilities\") pod \"redhat-marketplace-s6khh\" (UID: \"9c1f4e4f-a993-423e-8922-d8b81967d483\") " pod="openshift-marketplace/redhat-marketplace-s6khh" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.104626 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c1f4e4f-a993-423e-8922-d8b81967d483-utilities\") pod \"redhat-marketplace-s6khh\" (UID: \"9c1f4e4f-a993-423e-8922-d8b81967d483\") " pod="openshift-marketplace/redhat-marketplace-s6khh" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.131560 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px57g\" (UniqueName: \"kubernetes.io/projected/9c1f4e4f-a993-423e-8922-d8b81967d483-kube-api-access-px57g\") pod \"redhat-marketplace-s6khh\" (UID: \"9c1f4e4f-a993-423e-8922-d8b81967d483\") " pod="openshift-marketplace/redhat-marketplace-s6khh" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.204151 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-j8qrk" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.204215 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-j8qrk" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.206092 4685 patch_prober.go:28] interesting pod/console-f9d7485db-j8qrk container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.37:8443/health\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.206402 4685 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-j8qrk" podUID="49f645cb-7805-4ded-9f3e-d43bdb3801a6" containerName="console" probeResult="failure" output="Get \"https://10.217.0.37:8443/health\": dial tcp 10.217.0.37:8443: connect: connection refused" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.226136 4685 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-djlmv" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.298709 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-vds97" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.299273 4685 patch_prober.go:28] interesting pod/downloads-7954f5f757-clz2m container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.299328 4685 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-clz2m" podUID="eb165aaf-36a6-4965-bebf-6a40e1695b94" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.299505 4685 patch_prober.go:28] interesting pod/downloads-7954f5f757-clz2m container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.299543 4685 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-clz2m" podUID="eb165aaf-36a6-4965-bebf-6a40e1695b94" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.299659 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s6khh" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.304670 4685 patch_prober.go:28] interesting pod/router-default-5444994796-vds97 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 03:49:56 crc kubenswrapper[4685]: [-]has-synced failed: reason withheld Mar 21 03:49:56 crc kubenswrapper[4685]: [+]process-running ok Mar 21 03:49:56 crc kubenswrapper[4685]: healthz check failed Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.304724 4685 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vds97" podUID="b13d8427-dcef-4925-92b6-0e6bf1aca8c8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.309669 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="588eb87c-d2c0-45fb-a0f7-33de36d5d745" path="/var/lib/kubelet/pods/588eb87c-d2c0-45fb-a0f7-33de36d5d745/volumes" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.311099 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.311928 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd1c8c06-710c-401b-803e-9cc18aa1b4b6" path="/var/lib/kubelet/pods/cd1c8c06-710c-401b-803e-9cc18aa1b4b6/volumes" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.322850 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zrnqq"] Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.323754 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zrnqq" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.354809 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zrnqq"] Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.408052 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e0a768-d2ec-4986-8cc7-72f0bd1d285a-utilities\") pod \"redhat-marketplace-zrnqq\" (UID: \"f0e0a768-d2ec-4986-8cc7-72f0bd1d285a\") " pod="openshift-marketplace/redhat-marketplace-zrnqq" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.408108 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e0a768-d2ec-4986-8cc7-72f0bd1d285a-catalog-content\") pod \"redhat-marketplace-zrnqq\" (UID: \"f0e0a768-d2ec-4986-8cc7-72f0bd1d285a\") " pod="openshift-marketplace/redhat-marketplace-zrnqq" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.408134 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz76z\" (UniqueName: \"kubernetes.io/projected/f0e0a768-d2ec-4986-8cc7-72f0bd1d285a-kube-api-access-cz76z\") pod \"redhat-marketplace-zrnqq\" (UID: \"f0e0a768-d2ec-4986-8cc7-72f0bd1d285a\") " pod="openshift-marketplace/redhat-marketplace-zrnqq" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.481322 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-cf75f487f-fj8jm"] Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.482320 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-cf75f487f-fj8jm" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.486100 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fb258" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.486994 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.487225 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.487542 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.487819 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.488634 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.488874 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.490572 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.490592 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.491503 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-cf75f487f-fj8jm"] Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.493547 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.510516 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e0a768-d2ec-4986-8cc7-72f0bd1d285a-utilities\") pod \"redhat-marketplace-zrnqq\" (UID: \"f0e0a768-d2ec-4986-8cc7-72f0bd1d285a\") " pod="openshift-marketplace/redhat-marketplace-zrnqq" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.510601 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e0a768-d2ec-4986-8cc7-72f0bd1d285a-catalog-content\") pod \"redhat-marketplace-zrnqq\" (UID: \"f0e0a768-d2ec-4986-8cc7-72f0bd1d285a\") " pod="openshift-marketplace/redhat-marketplace-zrnqq" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.510625 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98f91583-6a92-4490-96d0-a15aa61f30bb-client-ca\") pod \"controller-manager-cf75f487f-fj8jm\" (UID: \"98f91583-6a92-4490-96d0-a15aa61f30bb\") " pod="openshift-controller-manager/controller-manager-cf75f487f-fj8jm" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.510659 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zmlbc\" (UniqueName: \"kubernetes.io/projected/98f91583-6a92-4490-96d0-a15aa61f30bb-kube-api-access-zmlbc\") pod \"controller-manager-cf75f487f-fj8jm\" (UID: \"98f91583-6a92-4490-96d0-a15aa61f30bb\") " pod="openshift-controller-manager/controller-manager-cf75f487f-fj8jm" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.510679 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz76z\" (UniqueName: \"kubernetes.io/projected/f0e0a768-d2ec-4986-8cc7-72f0bd1d285a-kube-api-access-cz76z\") pod \"redhat-marketplace-zrnqq\" (UID: \"f0e0a768-d2ec-4986-8cc7-72f0bd1d285a\") " pod="openshift-marketplace/redhat-marketplace-zrnqq" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.510712 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98f91583-6a92-4490-96d0-a15aa61f30bb-serving-cert\") pod \"controller-manager-cf75f487f-fj8jm\" (UID: \"98f91583-6a92-4490-96d0-a15aa61f30bb\") " pod="openshift-controller-manager/controller-manager-cf75f487f-fj8jm" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.510752 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98f91583-6a92-4490-96d0-a15aa61f30bb-config\") pod \"controller-manager-cf75f487f-fj8jm\" (UID: \"98f91583-6a92-4490-96d0-a15aa61f30bb\") " pod="openshift-controller-manager/controller-manager-cf75f487f-fj8jm" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.510825 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/98f91583-6a92-4490-96d0-a15aa61f30bb-proxy-ca-bundles\") pod \"controller-manager-cf75f487f-fj8jm\" (UID: \"98f91583-6a92-4490-96d0-a15aa61f30bb\") " pod="openshift-controller-manager/controller-manager-cf75f487f-fj8jm" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.511523 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e0a768-d2ec-4986-8cc7-72f0bd1d285a-utilities\") pod \"redhat-marketplace-zrnqq\" (UID: \"f0e0a768-d2ec-4986-8cc7-72f0bd1d285a\") " pod="openshift-marketplace/redhat-marketplace-zrnqq" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.512903 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.513094 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e0a768-d2ec-4986-8cc7-72f0bd1d285a-catalog-content\") pod \"redhat-marketplace-zrnqq\" (UID: \"f0e0a768-d2ec-4986-8cc7-72f0bd1d285a\") " pod="openshift-marketplace/redhat-marketplace-zrnqq" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.611758 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s6khh"] Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.612767 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmlbc\" (UniqueName: \"kubernetes.io/projected/98f91583-6a92-4490-96d0-a15aa61f30bb-kube-api-access-zmlbc\") pod \"controller-manager-cf75f487f-fj8jm\" (UID: \"98f91583-6a92-4490-96d0-a15aa61f30bb\") " 
pod="openshift-controller-manager/controller-manager-cf75f487f-fj8jm" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.612846 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98f91583-6a92-4490-96d0-a15aa61f30bb-serving-cert\") pod \"controller-manager-cf75f487f-fj8jm\" (UID: \"98f91583-6a92-4490-96d0-a15aa61f30bb\") " pod="openshift-controller-manager/controller-manager-cf75f487f-fj8jm" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.612884 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98f91583-6a92-4490-96d0-a15aa61f30bb-config\") pod \"controller-manager-cf75f487f-fj8jm\" (UID: \"98f91583-6a92-4490-96d0-a15aa61f30bb\") " pod="openshift-controller-manager/controller-manager-cf75f487f-fj8jm" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.613189 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/98f91583-6a92-4490-96d0-a15aa61f30bb-proxy-ca-bundles\") pod \"controller-manager-cf75f487f-fj8jm\" (UID: \"98f91583-6a92-4490-96d0-a15aa61f30bb\") " pod="openshift-controller-manager/controller-manager-cf75f487f-fj8jm" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.615133 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98f91583-6a92-4490-96d0-a15aa61f30bb-client-ca\") pod \"controller-manager-cf75f487f-fj8jm\" (UID: \"98f91583-6a92-4490-96d0-a15aa61f30bb\") " pod="openshift-controller-manager/controller-manager-cf75f487f-fj8jm" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.616210 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz76z\" (UniqueName: \"kubernetes.io/projected/f0e0a768-d2ec-4986-8cc7-72f0bd1d285a-kube-api-access-cz76z\") pod \"redhat-marketplace-zrnqq\" (UID: \"f0e0a768-d2ec-4986-8cc7-72f0bd1d285a\") " pod="openshift-marketplace/redhat-marketplace-zrnqq" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.616314 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98f91583-6a92-4490-96d0-a15aa61f30bb-client-ca\") pod \"controller-manager-cf75f487f-fj8jm\" (UID: \"98f91583-6a92-4490-96d0-a15aa61f30bb\") " pod="openshift-controller-manager/controller-manager-cf75f487f-fj8jm" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.617315 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/98f91583-6a92-4490-96d0-a15aa61f30bb-proxy-ca-bundles\") pod \"controller-manager-cf75f487f-fj8jm\" (UID: \"98f91583-6a92-4490-96d0-a15aa61f30bb\") " pod="openshift-controller-manager/controller-manager-cf75f487f-fj8jm" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.619214 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98f91583-6a92-4490-96d0-a15aa61f30bb-config\") pod \"controller-manager-cf75f487f-fj8jm\" (UID: \"98f91583-6a92-4490-96d0-a15aa61f30bb\") " pod="openshift-controller-manager/controller-manager-cf75f487f-fj8jm" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.621517 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/98f91583-6a92-4490-96d0-a15aa61f30bb-serving-cert\") pod \"controller-manager-cf75f487f-fj8jm\" (UID: \"98f91583-6a92-4490-96d0-a15aa61f30bb\") " pod="openshift-controller-manager/controller-manager-cf75f487f-fj8jm" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.636657 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmlbc\" (UniqueName: \"kubernetes.io/projected/98f91583-6a92-4490-96d0-a15aa61f30bb-kube-api-access-zmlbc\") pod \"controller-manager-cf75f487f-fj8jm\" (UID: \"98f91583-6a92-4490-96d0-a15aa61f30bb\") " pod="openshift-controller-manager/controller-manager-cf75f487f-fj8jm" Mar 21 03:49:56 crc kubenswrapper[4685]: W0321 03:49:56.652365 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c1f4e4f_a993_423e_8922_d8b81967d483.slice/crio-b73d0a9e450c9d0b42d69d2692d9c06def486d16557e10ec98273207144ee2d9 WatchSource:0}: Error finding container b73d0a9e450c9d0b42d69d2692d9c06def486d16557e10ec98273207144ee2d9: Status 404 returned error can't find the container with id b73d0a9e450c9d0b42d69d2692d9c06def486d16557e10ec98273207144ee2d9 Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.652570 4685 generic.go:334] "Generic (PLEG): container finished" podID="0306bf0e-f0c1-4e47-b63e-909b979c5844" containerID="616371f1f698c4c3153138ce8dc8751f4b5600bcbbaed1da8fcabab8ea80aa1d" exitCode=0 Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.652642 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567745-czvxj" event={"ID":"0306bf0e-f0c1-4e47-b63e-909b979c5844","Type":"ContainerDied","Data":"616371f1f698c4c3153138ce8dc8751f4b5600bcbbaed1da8fcabab8ea80aa1d"} Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.652952 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-skpjj" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.659268 4685 generic.go:334] "Generic (PLEG): container finished" podID="f7b276b3-8b85-4cfe-a39c-73da270336e3" containerID="ec70bcfacbcb092a27deb4937e29edf6b929d2ce6427e3f3d4290177836a63d5" exitCode=0 Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.659338 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dcgr" event={"ID":"f7b276b3-8b85-4cfe-a39c-73da270336e3","Type":"ContainerDied","Data":"ec70bcfacbcb092a27deb4937e29edf6b929d2ce6427e3f3d4290177836a63d5"} Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.663654 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"12290feb-3667-4dd9-9351-0865b9b9757e","Type":"ContainerStarted","Data":"4cd22b39e28d2a4b604aa28a688a3572f38f0e4a6f408b9de41e98ea1fe4e0b0"} Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.663704 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"12290feb-3667-4dd9-9351-0865b9b9757e","Type":"ContainerStarted","Data":"05137d5071072a23cdbbf9eb81306a0d3344a3ad55b1f2867d69a49800cdd318"} Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.671788 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-skpjj" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.672332 4685 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zrnqq" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.674902 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"2718979f-2e99-4a55-a1b3-71c8b79dd1cb","Type":"ContainerStarted","Data":"a39bf4d23469ce525698bc66ae5b05447e4776ca32e6f0d539bcc603ef1a0933"} Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.674955 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"2718979f-2e99-4a55-a1b3-71c8b79dd1cb","Type":"ContainerStarted","Data":"a562324d6d8ea9a650e26f76471f13f676a133ec7c2048c4d11a03d7f371c895"} Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.685046 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2pgbl" event={"ID":"b695978e-b67c-4812-9083-22538cdd3045","Type":"ContainerStarted","Data":"0d785772cfb382376cb8a64c9c9d389166564a48469098b9ca0ad44d68fbe3f4"} Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.698450 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c5b579c85-zbdrh" event={"ID":"b246afbf-fbd5-4f39-9c95-08aab014dda0","Type":"ContainerStarted","Data":"345685d8579eeb045c1610555ab08ac62f971fc6f14df13865bf2151b3d2739e"} Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.698512 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7c5b579c85-zbdrh" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.703105 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7c5b579c85-zbdrh" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.706026 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" event={"ID":"5a823511-d878-4e6d-acda-4202e00e3aab","Type":"ContainerStarted","Data":"b518ce61b84c42bf82a7212f4d9e88739f081f111fb39d9a384246703239366b"} Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.706125 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" event={"ID":"5a823511-d878-4e6d-acda-4202e00e3aab","Type":"ContainerStarted","Data":"d523779b5ef9b1c83fe719b88abac26dc872a10f67b416c1a1b169132b4b238e"} Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.706196 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.706825 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.706810648 podStartE2EDuration="1.706810648s" podCreationTimestamp="2026-03-21 03:49:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:56.703771746 +0000 UTC m=+229.180840538" watchObservedRunningTime="2026-03-21 03:49:56.706810648 +0000 UTC m=+229.183879440" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.713824 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tb9dl" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 
Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.744471 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7c5b579c85-zbdrh" podStartSLOduration=3.744455399 podStartE2EDuration="3.744455399s" podCreationTimestamp="2026-03-21 03:49:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:56.740406516 +0000 UTC m=+229.217475308" watchObservedRunningTime="2026-03-21 03:49:56.744455399 +0000 UTC m=+229.221524191"
Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.765589 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-2pgbl" podStartSLOduration=13.765571899 podStartE2EDuration="13.765571899s" podCreationTimestamp="2026-03-21 03:49:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:56.759464254 +0000 UTC m=+229.236533046" watchObservedRunningTime="2026-03-21 03:49:56.765571899 +0000 UTC m=+229.242640691"
Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.805217 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" podStartSLOduration=186.805194349 podStartE2EDuration="3m6.805194349s" podCreationTimestamp="2026-03-21 03:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:49:56.804729505 +0000 UTC m=+229.281798297" watchObservedRunningTime="2026-03-21 03:49:56.805194349 +0000 UTC m=+229.282263141"
Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.811036 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-cf75f487f-fj8jm"
Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.939619 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lnbh8"]
Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.940941 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lnbh8"
Need to start a new one" pod="openshift-marketplace/redhat-operators-lnbh8" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.948010 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-pcwhp" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.948617 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 21 03:49:56 crc kubenswrapper[4685]: I0321 03:49:56.949090 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lnbh8"] Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.120914 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm8t5\" (UniqueName: \"kubernetes.io/projected/4d6c412c-bc35-4360-91b0-06f8b60e7106-kube-api-access-wm8t5\") pod \"redhat-operators-lnbh8\" (UID: \"4d6c412c-bc35-4360-91b0-06f8b60e7106\") " pod="openshift-marketplace/redhat-operators-lnbh8" Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.121288 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d6c412c-bc35-4360-91b0-06f8b60e7106-catalog-content\") pod \"redhat-operators-lnbh8\" (UID: \"4d6c412c-bc35-4360-91b0-06f8b60e7106\") " pod="openshift-marketplace/redhat-operators-lnbh8" Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.121555 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d6c412c-bc35-4360-91b0-06f8b60e7106-utilities\") pod \"redhat-operators-lnbh8\" (UID: \"4d6c412c-bc35-4360-91b0-06f8b60e7106\") " pod="openshift-marketplace/redhat-operators-lnbh8" Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.222423 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm8t5\" (UniqueName: \"kubernetes.io/projected/4d6c412c-bc35-4360-91b0-06f8b60e7106-kube-api-access-wm8t5\") pod \"redhat-operators-lnbh8\" (UID: \"4d6c412c-bc35-4360-91b0-06f8b60e7106\") " pod="openshift-marketplace/redhat-operators-lnbh8" Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.222475 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d6c412c-bc35-4360-91b0-06f8b60e7106-catalog-content\") pod \"redhat-operators-lnbh8\" (UID: \"4d6c412c-bc35-4360-91b0-06f8b60e7106\") " pod="openshift-marketplace/redhat-operators-lnbh8" Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.222535 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d6c412c-bc35-4360-91b0-06f8b60e7106-utilities\") pod \"redhat-operators-lnbh8\" (UID: \"4d6c412c-bc35-4360-91b0-06f8b60e7106\") " pod="openshift-marketplace/redhat-operators-lnbh8" Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.223156 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d6c412c-bc35-4360-91b0-06f8b60e7106-utilities\") pod \"redhat-operators-lnbh8\" (UID: \"4d6c412c-bc35-4360-91b0-06f8b60e7106\") " pod="openshift-marketplace/redhat-operators-lnbh8" Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.223365 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d6c412c-bc35-4360-91b0-06f8b60e7106-catalog-content\") pod \"redhat-operators-lnbh8\" (UID: \"4d6c412c-bc35-4360-91b0-06f8b60e7106\") " pod="openshift-marketplace/redhat-operators-lnbh8" Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.243687 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm8t5\" (UniqueName: \"kubernetes.io/projected/4d6c412c-bc35-4360-91b0-06f8b60e7106-kube-api-access-wm8t5\") pod \"redhat-operators-lnbh8\" (UID: \"4d6c412c-bc35-4360-91b0-06f8b60e7106\") " pod="openshift-marketplace/redhat-operators-lnbh8" Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.302872 4685 patch_prober.go:28] interesting pod/router-default-5444994796-vds97 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 03:49:57 crc kubenswrapper[4685]: [-]has-synced failed: reason withheld Mar 21 03:49:57 crc kubenswrapper[4685]: [+]process-running ok Mar 21 03:49:57 crc kubenswrapper[4685]: healthz check failed Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.302950 4685 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vds97" podUID="b13d8427-dcef-4925-92b6-0e6bf1aca8c8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.308042 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zrnqq"] Mar 21 03:49:57 crc kubenswrapper[4685]: W0321 03:49:57.315797 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0e0a768_d2ec_4986_8cc7_72f0bd1d285a.slice/crio-d82770b3325a3670af07adf62156d88411d0f79ae22ac300c93532c72f76106a WatchSource:0}: Error finding container d82770b3325a3670af07adf62156d88411d0f79ae22ac300c93532c72f76106a: Status 404 returned error can't find the container with id d82770b3325a3670af07adf62156d88411d0f79ae22ac300c93532c72f76106a Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.324524 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8qq2k"] Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.325489 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8qq2k" Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.334118 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8qq2k"] Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.353108 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lnbh8" Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.372911 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-cf75f487f-fj8jm"] Mar 21 03:49:57 crc kubenswrapper[4685]: W0321 03:49:57.407295 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98f91583_6a92_4490_96d0_a15aa61f30bb.slice/crio-26d9cba37f46a5cbcc45279b257379996cc49b7a40ccd4484d0b6bf5ca73b867 WatchSource:0}: Error finding container 26d9cba37f46a5cbcc45279b257379996cc49b7a40ccd4484d0b6bf5ca73b867: Status 404 returned error can't find the container with id 26d9cba37f46a5cbcc45279b257379996cc49b7a40ccd4484d0b6bf5ca73b867 Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.424922 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/855b8c82-585a-4883-acdc-195377b480c2-utilities\") pod \"redhat-operators-8qq2k\" (UID: \"855b8c82-585a-4883-acdc-195377b480c2\") " pod="openshift-marketplace/redhat-operators-8qq2k" Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.425073 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kgmg\" (UniqueName: \"kubernetes.io/projected/855b8c82-585a-4883-acdc-195377b480c2-kube-api-access-6kgmg\") pod \"redhat-operators-8qq2k\" (UID: \"855b8c82-585a-4883-acdc-195377b480c2\") " pod="openshift-marketplace/redhat-operators-8qq2k" Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.425162 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/855b8c82-585a-4883-acdc-195377b480c2-catalog-content\") pod \"redhat-operators-8qq2k\" (UID: \"855b8c82-585a-4883-acdc-195377b480c2\") " pod="openshift-marketplace/redhat-operators-8qq2k" Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.527340 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kgmg\" (UniqueName: \"kubernetes.io/projected/855b8c82-585a-4883-acdc-195377b480c2-kube-api-access-6kgmg\") pod \"redhat-operators-8qq2k\" (UID: \"855b8c82-585a-4883-acdc-195377b480c2\") " pod="openshift-marketplace/redhat-operators-8qq2k" Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.527424 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/855b8c82-585a-4883-acdc-195377b480c2-catalog-content\") pod \"redhat-operators-8qq2k\" (UID: \"855b8c82-585a-4883-acdc-195377b480c2\") " pod="openshift-marketplace/redhat-operators-8qq2k" Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.527496 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/855b8c82-585a-4883-acdc-195377b480c2-utilities\") pod \"redhat-operators-8qq2k\" (UID: \"855b8c82-585a-4883-acdc-195377b480c2\") " pod="openshift-marketplace/redhat-operators-8qq2k" Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.528299 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/855b8c82-585a-4883-acdc-195377b480c2-utilities\") pod \"redhat-operators-8qq2k\" (UID: \"855b8c82-585a-4883-acdc-195377b480c2\") " 
pod="openshift-marketplace/redhat-operators-8qq2k" Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.533995 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/855b8c82-585a-4883-acdc-195377b480c2-catalog-content\") pod \"redhat-operators-8qq2k\" (UID: \"855b8c82-585a-4883-acdc-195377b480c2\") " pod="openshift-marketplace/redhat-operators-8qq2k" Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.547775 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kgmg\" (UniqueName: \"kubernetes.io/projected/855b8c82-585a-4883-acdc-195377b480c2-kube-api-access-6kgmg\") pod \"redhat-operators-8qq2k\" (UID: \"855b8c82-585a-4883-acdc-195377b480c2\") " pod="openshift-marketplace/redhat-operators-8qq2k" Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.625716 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lnbh8"] Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.645796 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8qq2k" Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.720330 4685 generic.go:334] "Generic (PLEG): container finished" podID="f0e0a768-d2ec-4986-8cc7-72f0bd1d285a" containerID="417a39165b63bc101bfa9471132fb58a4c884a0229b4c6c2c08111c875b2e605" exitCode=0 Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.720431 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zrnqq" event={"ID":"f0e0a768-d2ec-4986-8cc7-72f0bd1d285a","Type":"ContainerDied","Data":"417a39165b63bc101bfa9471132fb58a4c884a0229b4c6c2c08111c875b2e605"} Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.720756 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zrnqq" event={"ID":"f0e0a768-d2ec-4986-8cc7-72f0bd1d285a","Type":"ContainerStarted","Data":"d82770b3325a3670af07adf62156d88411d0f79ae22ac300c93532c72f76106a"} Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.722879 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cf75f487f-fj8jm" event={"ID":"98f91583-6a92-4490-96d0-a15aa61f30bb","Type":"ContainerStarted","Data":"26d9cba37f46a5cbcc45279b257379996cc49b7a40ccd4484d0b6bf5ca73b867"} Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.724730 4685 generic.go:334] "Generic (PLEG): container finished" podID="9c1f4e4f-a993-423e-8922-d8b81967d483" containerID="6ecb8492ed241704c3e9700021f409ab933f5a648a7549ec3abda63dab206bf8" exitCode=0 Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.724811 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s6khh" event={"ID":"9c1f4e4f-a993-423e-8922-d8b81967d483","Type":"ContainerDied","Data":"6ecb8492ed241704c3e9700021f409ab933f5a648a7549ec3abda63dab206bf8"} Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.724866 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s6khh" event={"ID":"9c1f4e4f-a993-423e-8922-d8b81967d483","Type":"ContainerStarted","Data":"b73d0a9e450c9d0b42d69d2692d9c06def486d16557e10ec98273207144ee2d9"} Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.727746 4685 generic.go:334] "Generic (PLEG): container finished" podID="12290feb-3667-4dd9-9351-0865b9b9757e" 
containerID="4cd22b39e28d2a4b604aa28a688a3572f38f0e4a6f408b9de41e98ea1fe4e0b0" exitCode=0 Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.727884 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"12290feb-3667-4dd9-9351-0865b9b9757e","Type":"ContainerDied","Data":"4cd22b39e28d2a4b604aa28a688a3572f38f0e4a6f408b9de41e98ea1fe4e0b0"} Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.743701 4685 generic.go:334] "Generic (PLEG): container finished" podID="2718979f-2e99-4a55-a1b3-71c8b79dd1cb" containerID="a39bf4d23469ce525698bc66ae5b05447e4776ca32e6f0d539bcc603ef1a0933" exitCode=0 Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.743871 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"2718979f-2e99-4a55-a1b3-71c8b79dd1cb","Type":"ContainerDied","Data":"a39bf4d23469ce525698bc66ae5b05447e4776ca32e6f0d539bcc603ef1a0933"} Mar 21 03:49:57 crc kubenswrapper[4685]: I0321 03:49:57.936332 4685 ???:1] "http: TLS handshake error from 192.168.126.11:54160: no serving certificate available for the kubelet" Mar 21 03:49:58 crc kubenswrapper[4685]: I0321 03:49:58.303367 4685 patch_prober.go:28] interesting pod/router-default-5444994796-vds97 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 03:49:58 crc kubenswrapper[4685]: [-]has-synced failed: reason withheld Mar 21 03:49:58 crc kubenswrapper[4685]: [+]process-running ok Mar 21 03:49:58 crc kubenswrapper[4685]: healthz check failed Mar 21 03:49:58 crc kubenswrapper[4685]: I0321 03:49:58.303416 4685 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vds97" podUID="b13d8427-dcef-4925-92b6-0e6bf1aca8c8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 03:49:59 crc kubenswrapper[4685]: I0321 03:49:59.301509 4685 patch_prober.go:28] interesting pod/router-default-5444994796-vds97 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 03:49:59 crc kubenswrapper[4685]: [-]has-synced failed: reason withheld Mar 21 03:49:59 crc kubenswrapper[4685]: [+]process-running ok Mar 21 03:49:59 crc kubenswrapper[4685]: healthz check failed Mar 21 03:49:59 crc kubenswrapper[4685]: I0321 03:49:59.302060 4685 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vds97" podUID="b13d8427-dcef-4925-92b6-0e6bf1aca8c8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 03:50:00 crc kubenswrapper[4685]: I0321 03:50:00.129883 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567750-hlpdj"] Mar 21 03:50:00 crc kubenswrapper[4685]: I0321 03:50:00.132558 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567750-hlpdj" Mar 21 03:50:00 crc kubenswrapper[4685]: I0321 03:50:00.135753 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k75cc" Mar 21 03:50:00 crc kubenswrapper[4685]: I0321 03:50:00.136271 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567750-hlpdj"] Mar 21 03:50:00 crc kubenswrapper[4685]: I0321 03:50:00.244919 4685 ???:1] "http: TLS handshake error from 192.168.126.11:54174: no serving certificate available for the kubelet" Mar 21 03:50:00 crc kubenswrapper[4685]: I0321 03:50:00.296329 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p672f\" (UniqueName: \"kubernetes.io/projected/2a68d63c-113c-4421-9444-78d05d636874-kube-api-access-p672f\") pod \"auto-csr-approver-29567750-hlpdj\" (UID: \"2a68d63c-113c-4421-9444-78d05d636874\") " pod="openshift-infra/auto-csr-approver-29567750-hlpdj" Mar 21 03:50:00 crc kubenswrapper[4685]: I0321 03:50:00.301348 4685 patch_prober.go:28] interesting pod/router-default-5444994796-vds97 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 03:50:00 crc kubenswrapper[4685]: [-]has-synced failed: reason withheld Mar 21 03:50:00 crc kubenswrapper[4685]: [+]process-running ok Mar 21 03:50:00 crc kubenswrapper[4685]: healthz check failed Mar 21 03:50:00 crc kubenswrapper[4685]: I0321 03:50:00.301402 4685 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vds97" podUID="b13d8427-dcef-4925-92b6-0e6bf1aca8c8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 03:50:00 crc kubenswrapper[4685]: I0321 03:50:00.317924 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-864h4" Mar 21 03:50:00 crc kubenswrapper[4685]: I0321 03:50:00.322888 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-864h4" Mar 21 03:50:00 crc kubenswrapper[4685]: I0321 03:50:00.397119 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p672f\" (UniqueName: \"kubernetes.io/projected/2a68d63c-113c-4421-9444-78d05d636874-kube-api-access-p672f\") pod \"auto-csr-approver-29567750-hlpdj\" (UID: \"2a68d63c-113c-4421-9444-78d05d636874\") " pod="openshift-infra/auto-csr-approver-29567750-hlpdj" Mar 21 03:50:00 crc kubenswrapper[4685]: I0321 03:50:00.445915 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p672f\" (UniqueName: \"kubernetes.io/projected/2a68d63c-113c-4421-9444-78d05d636874-kube-api-access-p672f\") pod \"auto-csr-approver-29567750-hlpdj\" (UID: \"2a68d63c-113c-4421-9444-78d05d636874\") " pod="openshift-infra/auto-csr-approver-29567750-hlpdj" Mar 21 03:50:00 crc kubenswrapper[4685]: I0321 03:50:00.509741 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567750-hlpdj" Mar 21 03:50:01 crc kubenswrapper[4685]: I0321 03:50:01.302088 4685 patch_prober.go:28] interesting pod/router-default-5444994796-vds97 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 03:50:01 crc kubenswrapper[4685]: [-]has-synced failed: reason withheld Mar 21 03:50:01 crc kubenswrapper[4685]: [+]process-running ok Mar 21 03:50:01 crc kubenswrapper[4685]: healthz check failed Mar 21 03:50:01 crc kubenswrapper[4685]: I0321 03:50:01.302413 4685 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vds97" podUID="b13d8427-dcef-4925-92b6-0e6bf1aca8c8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 03:50:01 crc kubenswrapper[4685]: I0321 03:50:01.659448 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-hncrq" Mar 21 03:50:01 crc kubenswrapper[4685]: W0321 03:50:01.831059 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d6c412c_bc35_4360_91b0_06f8b60e7106.slice/crio-9e5b9e6ab7081e16fb1172c0d1f9dee7bde06a7cd002e2c62ddefbefc368a5f3 WatchSource:0}: Error finding container 9e5b9e6ab7081e16fb1172c0d1f9dee7bde06a7cd002e2c62ddefbefc368a5f3: Status 404 returned error can't find the container with id 9e5b9e6ab7081e16fb1172c0d1f9dee7bde06a7cd002e2c62ddefbefc368a5f3 Mar 21 03:50:01 crc kubenswrapper[4685]: I0321 03:50:01.903609 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 03:50:01 crc kubenswrapper[4685]: I0321 03:50:01.911127 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 03:50:01 crc kubenswrapper[4685]: I0321 03:50:01.923478 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567745-czvxj" Mar 21 03:50:02 crc kubenswrapper[4685]: I0321 03:50:02.020097 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/12290feb-3667-4dd9-9351-0865b9b9757e-kube-api-access\") pod \"12290feb-3667-4dd9-9351-0865b9b9757e\" (UID: \"12290feb-3667-4dd9-9351-0865b9b9757e\") " Mar 21 03:50:02 crc kubenswrapper[4685]: I0321 03:50:02.020182 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/12290feb-3667-4dd9-9351-0865b9b9757e-kubelet-dir\") pod \"12290feb-3667-4dd9-9351-0865b9b9757e\" (UID: \"12290feb-3667-4dd9-9351-0865b9b9757e\") " Mar 21 03:50:02 crc kubenswrapper[4685]: I0321 03:50:02.020211 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2718979f-2e99-4a55-a1b3-71c8b79dd1cb-kube-api-access\") pod \"2718979f-2e99-4a55-a1b3-71c8b79dd1cb\" (UID: \"2718979f-2e99-4a55-a1b3-71c8b79dd1cb\") " Mar 21 03:50:02 crc kubenswrapper[4685]: I0321 03:50:02.020243 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2718979f-2e99-4a55-a1b3-71c8b79dd1cb-kubelet-dir\") pod \"2718979f-2e99-4a55-a1b3-71c8b79dd1cb\" (UID: \"2718979f-2e99-4a55-a1b3-71c8b79dd1cb\") " Mar 21 03:50:02 crc kubenswrapper[4685]: I0321 03:50:02.020531 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2718979f-2e99-4a55-a1b3-71c8b79dd1cb-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2718979f-2e99-4a55-a1b3-71c8b79dd1cb" (UID: "2718979f-2e99-4a55-a1b3-71c8b79dd1cb"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 03:50:02 crc kubenswrapper[4685]: I0321 03:50:02.020967 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12290feb-3667-4dd9-9351-0865b9b9757e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "12290feb-3667-4dd9-9351-0865b9b9757e" (UID: "12290feb-3667-4dd9-9351-0865b9b9757e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 03:50:02 crc kubenswrapper[4685]: I0321 03:50:02.031078 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2718979f-2e99-4a55-a1b3-71c8b79dd1cb-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2718979f-2e99-4a55-a1b3-71c8b79dd1cb" (UID: "2718979f-2e99-4a55-a1b3-71c8b79dd1cb"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:50:02 crc kubenswrapper[4685]: I0321 03:50:02.031958 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12290feb-3667-4dd9-9351-0865b9b9757e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "12290feb-3667-4dd9-9351-0865b9b9757e" (UID: "12290feb-3667-4dd9-9351-0865b9b9757e"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:50:02 crc kubenswrapper[4685]: I0321 03:50:02.121778 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mv8pp\" (UniqueName: \"kubernetes.io/projected/0306bf0e-f0c1-4e47-b63e-909b979c5844-kube-api-access-mv8pp\") pod \"0306bf0e-f0c1-4e47-b63e-909b979c5844\" (UID: \"0306bf0e-f0c1-4e47-b63e-909b979c5844\") " Mar 21 03:50:02 crc kubenswrapper[4685]: I0321 03:50:02.121830 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0306bf0e-f0c1-4e47-b63e-909b979c5844-config-volume\") pod \"0306bf0e-f0c1-4e47-b63e-909b979c5844\" (UID: \"0306bf0e-f0c1-4e47-b63e-909b979c5844\") " Mar 21 03:50:02 crc kubenswrapper[4685]: I0321 03:50:02.121880 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0306bf0e-f0c1-4e47-b63e-909b979c5844-secret-volume\") pod \"0306bf0e-f0c1-4e47-b63e-909b979c5844\" (UID: \"0306bf0e-f0c1-4e47-b63e-909b979c5844\") " Mar 21 03:50:02 crc kubenswrapper[4685]: I0321 03:50:02.122106 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/12290feb-3667-4dd9-9351-0865b9b9757e-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:02 crc kubenswrapper[4685]: I0321 03:50:02.122123 4685 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/12290feb-3667-4dd9-9351-0865b9b9757e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:02 crc kubenswrapper[4685]: I0321 03:50:02.122131 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2718979f-2e99-4a55-a1b3-71c8b79dd1cb-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:02 crc kubenswrapper[4685]: I0321 03:50:02.122141 4685 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2718979f-2e99-4a55-a1b3-71c8b79dd1cb-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:02 crc kubenswrapper[4685]: I0321 03:50:02.123496 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0306bf0e-f0c1-4e47-b63e-909b979c5844-config-volume" (OuterVolumeSpecName: "config-volume") pod "0306bf0e-f0c1-4e47-b63e-909b979c5844" (UID: "0306bf0e-f0c1-4e47-b63e-909b979c5844"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:50:02 crc kubenswrapper[4685]: I0321 03:50:02.141424 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0306bf0e-f0c1-4e47-b63e-909b979c5844-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0306bf0e-f0c1-4e47-b63e-909b979c5844" (UID: "0306bf0e-f0c1-4e47-b63e-909b979c5844"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:50:02 crc kubenswrapper[4685]: I0321 03:50:02.142283 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0306bf0e-f0c1-4e47-b63e-909b979c5844-kube-api-access-mv8pp" (OuterVolumeSpecName: "kube-api-access-mv8pp") pod "0306bf0e-f0c1-4e47-b63e-909b979c5844" (UID: "0306bf0e-f0c1-4e47-b63e-909b979c5844"). InnerVolumeSpecName "kube-api-access-mv8pp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:50:02 crc kubenswrapper[4685]: I0321 03:50:02.223607 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mv8pp\" (UniqueName: \"kubernetes.io/projected/0306bf0e-f0c1-4e47-b63e-909b979c5844-kube-api-access-mv8pp\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:02 crc kubenswrapper[4685]: I0321 03:50:02.223647 4685 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0306bf0e-f0c1-4e47-b63e-909b979c5844-config-volume\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:02 crc kubenswrapper[4685]: I0321 03:50:02.223657 4685 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0306bf0e-f0c1-4e47-b63e-909b979c5844-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:02 crc kubenswrapper[4685]: I0321 03:50:02.301366 4685 patch_prober.go:28] interesting pod/router-default-5444994796-vds97 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 03:50:02 crc kubenswrapper[4685]: [-]has-synced failed: reason withheld Mar 21 03:50:02 crc kubenswrapper[4685]: [+]process-running ok Mar 21 03:50:02 crc kubenswrapper[4685]: healthz check failed Mar 21 03:50:02 crc kubenswrapper[4685]: I0321 03:50:02.301420 4685 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vds97" podUID="b13d8427-dcef-4925-92b6-0e6bf1aca8c8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 03:50:02 crc kubenswrapper[4685]: I0321 03:50:02.785361 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lnbh8" event={"ID":"4d6c412c-bc35-4360-91b0-06f8b60e7106","Type":"ContainerStarted","Data":"9e5b9e6ab7081e16fb1172c0d1f9dee7bde06a7cd002e2c62ddefbefc368a5f3"} Mar 21 03:50:02 crc kubenswrapper[4685]: I0321 03:50:02.786803 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567745-czvxj" event={"ID":"0306bf0e-f0c1-4e47-b63e-909b979c5844","Type":"ContainerDied","Data":"d892bfba09ee863f547cc620c295ae4e415a18c4f06c302e67c4123329179220"} Mar 21 03:50:02 crc kubenswrapper[4685]: I0321 03:50:02.786831 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d892bfba09ee863f547cc620c295ae4e415a18c4f06c302e67c4123329179220" Mar 21 03:50:02 crc kubenswrapper[4685]: I0321 03:50:02.786928 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567745-czvxj" Mar 21 03:50:02 crc kubenswrapper[4685]: I0321 03:50:02.790198 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"12290feb-3667-4dd9-9351-0865b9b9757e","Type":"ContainerDied","Data":"05137d5071072a23cdbbf9eb81306a0d3344a3ad55b1f2867d69a49800cdd318"} Mar 21 03:50:02 crc kubenswrapper[4685]: I0321 03:50:02.790231 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05137d5071072a23cdbbf9eb81306a0d3344a3ad55b1f2867d69a49800cdd318" Mar 21 03:50:02 crc kubenswrapper[4685]: I0321 03:50:02.790344 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 03:50:02 crc kubenswrapper[4685]: I0321 03:50:02.795283 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"2718979f-2e99-4a55-a1b3-71c8b79dd1cb","Type":"ContainerDied","Data":"a562324d6d8ea9a650e26f76471f13f676a133ec7c2048c4d11a03d7f371c895"} Mar 21 03:50:02 crc kubenswrapper[4685]: I0321 03:50:02.795312 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a562324d6d8ea9a650e26f76471f13f676a133ec7c2048c4d11a03d7f371c895" Mar 21 03:50:02 crc kubenswrapper[4685]: I0321 03:50:02.795370 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 03:50:03 crc kubenswrapper[4685]: I0321 03:50:03.302505 4685 patch_prober.go:28] interesting pod/router-default-5444994796-vds97 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 03:50:03 crc kubenswrapper[4685]: [-]has-synced failed: reason withheld Mar 21 03:50:03 crc kubenswrapper[4685]: [+]process-running ok Mar 21 03:50:03 crc kubenswrapper[4685]: healthz check failed Mar 21 03:50:03 crc kubenswrapper[4685]: I0321 03:50:03.302567 4685 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vds97" podUID="b13d8427-dcef-4925-92b6-0e6bf1aca8c8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 03:50:04 crc kubenswrapper[4685]: I0321 03:50:04.300240 4685 patch_prober.go:28] interesting pod/router-default-5444994796-vds97 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 03:50:04 crc kubenswrapper[4685]: [-]has-synced failed: reason withheld Mar 21 03:50:04 crc kubenswrapper[4685]: [+]process-running ok Mar 21 03:50:04 crc kubenswrapper[4685]: healthz check failed Mar 21 03:50:04 crc kubenswrapper[4685]: I0321 03:50:04.300299 4685 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vds97" podUID="b13d8427-dcef-4925-92b6-0e6bf1aca8c8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 03:50:05 crc kubenswrapper[4685]: I0321 03:50:05.300931 4685 patch_prober.go:28] interesting pod/router-default-5444994796-vds97 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 03:50:05 crc kubenswrapper[4685]: [-]has-synced failed: reason withheld Mar 21 03:50:05 crc kubenswrapper[4685]: [+]process-running ok Mar 21 03:50:05 crc kubenswrapper[4685]: healthz check failed Mar 21 03:50:05 crc kubenswrapper[4685]: I0321 03:50:05.301350 4685 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vds97" podUID="b13d8427-dcef-4925-92b6-0e6bf1aca8c8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 03:50:06 crc kubenswrapper[4685]: I0321 03:50:06.204700 4685 patch_prober.go:28] interesting pod/console-f9d7485db-j8qrk container/console namespace/openshift-console: Startup probe status=failure output="Get 
\"https://10.217.0.37:8443/health\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Mar 21 03:50:06 crc kubenswrapper[4685]: I0321 03:50:06.204952 4685 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-j8qrk" podUID="49f645cb-7805-4ded-9f3e-d43bdb3801a6" containerName="console" probeResult="failure" output="Get \"https://10.217.0.37:8443/health\": dial tcp 10.217.0.37:8443: connect: connection refused" Mar 21 03:50:06 crc kubenswrapper[4685]: I0321 03:50:06.261635 4685 patch_prober.go:28] interesting pod/downloads-7954f5f757-clz2m container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Mar 21 03:50:06 crc kubenswrapper[4685]: I0321 03:50:06.261663 4685 patch_prober.go:28] interesting pod/downloads-7954f5f757-clz2m container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Mar 21 03:50:06 crc kubenswrapper[4685]: I0321 03:50:06.261691 4685 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-clz2m" podUID="eb165aaf-36a6-4965-bebf-6a40e1695b94" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 21 03:50:06 crc kubenswrapper[4685]: I0321 03:50:06.261701 4685 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-clz2m" podUID="eb165aaf-36a6-4965-bebf-6a40e1695b94" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 21 03:50:06 crc kubenswrapper[4685]: I0321 03:50:06.307295 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-vds97" Mar 21 03:50:06 crc kubenswrapper[4685]: I0321 03:50:06.311728 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-vds97" Mar 21 03:50:07 crc kubenswrapper[4685]: E0321 03:50:07.847026 4685 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 21 03:50:07 crc kubenswrapper[4685]: E0321 03:50:07.847194 4685 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 21 03:50:07 crc kubenswrapper[4685]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 21 03:50:07 crc kubenswrapper[4685]: 
Mar 21 03:50:07 crc kubenswrapper[4685]: > logger="UnhandledError"
Mar 21 03:50:07 crc kubenswrapper[4685]: E0321 03:50:07.848983 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29567748-zv7h8" podUID="4ede3f08-f29b-4cb9-a96f-1c66239498f6"
Mar 21 03:50:08 crc kubenswrapper[4685]: I0321 03:50:08.198272 4685 ???:1] "http: TLS handshake error from 192.168.126.11:42836: no serving certificate available for the kubelet"
Mar 21 03:50:08 crc kubenswrapper[4685]: E0321 03:50:08.826475 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29567748-zv7h8" podUID="4ede3f08-f29b-4cb9-a96f-1c66239498f6"
Mar 21 03:50:09 crc kubenswrapper[4685]: I0321 03:50:09.684979 4685 patch_prober.go:28] interesting pod/machine-config-daemon-7r9cg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 21 03:50:09 crc kubenswrapper[4685]: I0321 03:50:09.685446 4685 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 21 03:50:12 crc kubenswrapper[4685]: I0321 03:50:12.555684 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-cf75f487f-fj8jm"]
Mar 21 03:50:12 crc kubenswrapper[4685]: I0321 03:50:12.579792 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c5b579c85-zbdrh"]
Mar 21 03:50:12 crc kubenswrapper[4685]: I0321 03:50:12.580060 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7c5b579c85-zbdrh" podUID="b246afbf-fbd5-4f39-9c95-08aab014dda0" containerName="route-controller-manager" containerID="cri-o://345685d8579eeb045c1610555ab08ac62f971fc6f14df13865bf2151b3d2739e" gracePeriod=30
Mar 21 03:50:14 crc kubenswrapper[4685]: I0321 03:50:14.260408 4685 generic.go:334] "Generic (PLEG): container finished" podID="b246afbf-fbd5-4f39-9c95-08aab014dda0" containerID="345685d8579eeb045c1610555ab08ac62f971fc6f14df13865bf2151b3d2739e" exitCode=0
Mar 21 03:50:14 crc kubenswrapper[4685]: I0321 03:50:14.260449 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c5b579c85-zbdrh" event={"ID":"b246afbf-fbd5-4f39-9c95-08aab014dda0","Type":"ContainerDied","Data":"345685d8579eeb045c1610555ab08ac62f971fc6f14df13865bf2151b3d2739e"}
Mar 21 03:50:14 crc kubenswrapper[4685]: I0321 03:50:14.773732 4685 patch_prober.go:28] interesting pod/route-controller-manager-7c5b579c85-zbdrh container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.49:8443/healthz\": dial tcp 10.217.0.49:8443: connect: connection refused" start-of-body=
Mar 21 03:50:14 crc kubenswrapper[4685]: I0321 03:50:14.773777 4685 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7c5b579c85-zbdrh" podUID="b246afbf-fbd5-4f39-9c95-08aab014dda0" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.49:8443/healthz\": dial tcp 10.217.0.49:8443: connect: connection refused"
Mar 21 03:50:15 crc kubenswrapper[4685]: I0321 03:50:15.839406 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8"
Mar 21 03:50:16 crc kubenswrapper[4685]: I0321 03:50:16.211805 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-j8qrk"
Mar 21 03:50:16 crc kubenswrapper[4685]: I0321 03:50:16.217822 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-j8qrk"
Mar 21 03:50:16 crc kubenswrapper[4685]: I0321 03:50:16.268761 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-clz2m"
Mar 21 03:50:17 crc kubenswrapper[4685]: I0321 03:50:17.544766 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8qq2k"]
Mar 21 03:50:17 crc kubenswrapper[4685]: I0321 03:50:17.585766 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567750-hlpdj"]
Mar 21 03:50:19 crc kubenswrapper[4685]: I0321 03:50:19.294047 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cf75f487f-fj8jm" event={"ID":"98f91583-6a92-4490-96d0-a15aa61f30bb","Type":"ContainerStarted","Data":"21ba54647eee116126737f67116e01f945c7bbdda2c86646d7cbb15b2bffd206"}
Mar 21 03:50:23 crc kubenswrapper[4685]: I0321 03:50:23.316950 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-cf75f487f-fj8jm" podUID="98f91583-6a92-4490-96d0-a15aa61f30bb" containerName="controller-manager" containerID="cri-o://21ba54647eee116126737f67116e01f945c7bbdda2c86646d7cbb15b2bffd206" gracePeriod=30
Mar 21 03:50:23 crc kubenswrapper[4685]: I0321 03:50:23.317216 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-cf75f487f-fj8jm"
Mar 21 03:50:23 crc kubenswrapper[4685]: I0321 03:50:23.324293 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-cf75f487f-fj8jm"
pod="openshift-controller-manager/controller-manager-cf75f487f-fj8jm" Mar 21 03:50:23 crc kubenswrapper[4685]: I0321 03:50:23.341984 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-cf75f487f-fj8jm" podStartSLOduration=30.341969226 podStartE2EDuration="30.341969226s" podCreationTimestamp="2026-03-21 03:49:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:50:23.340207083 +0000 UTC m=+255.817275875" watchObservedRunningTime="2026-03-21 03:50:23.341969226 +0000 UTC m=+255.819038018" Mar 21 03:50:24 crc kubenswrapper[4685]: W0321 03:50:24.626274 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a68d63c_113c_4421_9444_78d05d636874.slice/crio-e321bb85f49437c165b0c5a6b71d42d182d8a68f6c24afd740ccab510d39c3ce WatchSource:0}: Error finding container e321bb85f49437c165b0c5a6b71d42d182d8a68f6c24afd740ccab510d39c3ce: Status 404 returned error can't find the container with id e321bb85f49437c165b0c5a6b71d42d182d8a68f6c24afd740ccab510d39c3ce Mar 21 03:50:25 crc kubenswrapper[4685]: E0321 03:50:25.027299 4685 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 21 03:50:25 crc kubenswrapper[4685]: E0321 03:50:25.029305 4685 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-27mn8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-xvjs7_openshift-marketplace(53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 21 03:50:25 crc kubenswrapper[4685]: E0321 03:50:25.031614 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-xvjs7" podUID="53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd" Mar 21 03:50:25 crc kubenswrapper[4685]: I0321 03:50:25.333956 4685 generic.go:334] "Generic (PLEG): container finished" podID="98f91583-6a92-4490-96d0-a15aa61f30bb" containerID="21ba54647eee116126737f67116e01f945c7bbdda2c86646d7cbb15b2bffd206" exitCode=0 Mar 21 03:50:25 crc kubenswrapper[4685]: I0321 03:50:25.334063 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cf75f487f-fj8jm" event={"ID":"98f91583-6a92-4490-96d0-a15aa61f30bb","Type":"ContainerDied","Data":"21ba54647eee116126737f67116e01f945c7bbdda2c86646d7cbb15b2bffd206"} Mar 21 03:50:25 crc kubenswrapper[4685]: I0321 03:50:25.335904 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567750-hlpdj" event={"ID":"2a68d63c-113c-4421-9444-78d05d636874","Type":"ContainerStarted","Data":"e321bb85f49437c165b0c5a6b71d42d182d8a68f6c24afd740ccab510d39c3ce"} Mar 21 03:50:25 crc kubenswrapper[4685]: I0321 03:50:25.774281 4685 patch_prober.go:28] interesting pod/route-controller-manager-7c5b579c85-zbdrh container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.49:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 21 03:50:25 crc kubenswrapper[4685]: I0321 03:50:25.774377 4685 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7c5b579c85-zbdrh" podUID="b246afbf-fbd5-4f39-9c95-08aab014dda0" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.49:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 21 03:50:26 crc kubenswrapper[4685]: I0321 03:50:26.668915 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n8ttn" Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.353942 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 03:50:27 crc kubenswrapper[4685]: E0321 03:50:27.478454 4685 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 21 03:50:27 crc kubenswrapper[4685]: E0321 03:50:27.478813 4685 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5zktx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-7swf2_openshift-marketplace(d4ebdf18-8426-42cc-93a6-60b46261aebe): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 21 03:50:27 crc kubenswrapper[4685]: E0321 03:50:27.480031 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-7swf2" podUID="d4ebdf18-8426-42cc-93a6-60b46261aebe" Mar 21 03:50:27 crc kubenswrapper[4685]: E0321 03:50:27.503401 4685 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 21 03:50:27 crc kubenswrapper[4685]: E0321 03:50:27.503537 4685 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jtx65,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-mhc9s_openshift-marketplace(931ed0e7-7ffb-48ba-92b0-28883a6f0b39): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 21 03:50:27 crc kubenswrapper[4685]: E0321 03:50:27.504689 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-mhc9s" podUID="931ed0e7-7ffb-48ba-92b0-28883a6f0b39" Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.548186 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c5b579c85-zbdrh" Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.580476 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cb978c9f5-7497s"] Mar 21 03:50:27 crc kubenswrapper[4685]: E0321 03:50:27.580752 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2718979f-2e99-4a55-a1b3-71c8b79dd1cb" containerName="pruner" Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.580769 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="2718979f-2e99-4a55-a1b3-71c8b79dd1cb" containerName="pruner" Mar 21 03:50:27 crc kubenswrapper[4685]: E0321 03:50:27.580790 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b246afbf-fbd5-4f39-9c95-08aab014dda0" containerName="route-controller-manager" Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.581197 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="b246afbf-fbd5-4f39-9c95-08aab014dda0" containerName="route-controller-manager" Mar 21 03:50:27 crc kubenswrapper[4685]: E0321 03:50:27.581241 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12290feb-3667-4dd9-9351-0865b9b9757e" containerName="pruner" Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.581689 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="12290feb-3667-4dd9-9351-0865b9b9757e" containerName="pruner" Mar 21 03:50:27 crc kubenswrapper[4685]: E0321 03:50:27.581727 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0306bf0e-f0c1-4e47-b63e-909b979c5844" containerName="collect-profiles" Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.581739 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="0306bf0e-f0c1-4e47-b63e-909b979c5844" containerName="collect-profiles" Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.581969 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="b246afbf-fbd5-4f39-9c95-08aab014dda0" containerName="route-controller-manager" Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.581997 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="12290feb-3667-4dd9-9351-0865b9b9757e" containerName="pruner" Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.582015 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="2718979f-2e99-4a55-a1b3-71c8b79dd1cb" containerName="pruner" Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.582027 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="0306bf0e-f0c1-4e47-b63e-909b979c5844" containerName="collect-profiles" Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.582573 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6cb978c9f5-7497s" Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.588206 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cb978c9f5-7497s"] Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.669143 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b246afbf-fbd5-4f39-9c95-08aab014dda0-client-ca\") pod \"b246afbf-fbd5-4f39-9c95-08aab014dda0\" (UID: \"b246afbf-fbd5-4f39-9c95-08aab014dda0\") " Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.669218 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np9mh\" (UniqueName: \"kubernetes.io/projected/b246afbf-fbd5-4f39-9c95-08aab014dda0-kube-api-access-np9mh\") pod \"b246afbf-fbd5-4f39-9c95-08aab014dda0\" (UID: \"b246afbf-fbd5-4f39-9c95-08aab014dda0\") " Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.669266 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b246afbf-fbd5-4f39-9c95-08aab014dda0-config\") pod \"b246afbf-fbd5-4f39-9c95-08aab014dda0\" (UID: \"b246afbf-fbd5-4f39-9c95-08aab014dda0\") " Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.669327 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b246afbf-fbd5-4f39-9c95-08aab014dda0-serving-cert\") pod \"b246afbf-fbd5-4f39-9c95-08aab014dda0\" (UID: \"b246afbf-fbd5-4f39-9c95-08aab014dda0\") " Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.670336 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b246afbf-fbd5-4f39-9c95-08aab014dda0-config" (OuterVolumeSpecName: "config") pod "b246afbf-fbd5-4f39-9c95-08aab014dda0" (UID: "b246afbf-fbd5-4f39-9c95-08aab014dda0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.670327 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b246afbf-fbd5-4f39-9c95-08aab014dda0-client-ca" (OuterVolumeSpecName: "client-ca") pod "b246afbf-fbd5-4f39-9c95-08aab014dda0" (UID: "b246afbf-fbd5-4f39-9c95-08aab014dda0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.670758 4685 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b246afbf-fbd5-4f39-9c95-08aab014dda0-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.670781 4685 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b246afbf-fbd5-4f39-9c95-08aab014dda0-config\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.674894 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b246afbf-fbd5-4f39-9c95-08aab014dda0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b246afbf-fbd5-4f39-9c95-08aab014dda0" (UID: "b246afbf-fbd5-4f39-9c95-08aab014dda0"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.675541 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b246afbf-fbd5-4f39-9c95-08aab014dda0-kube-api-access-np9mh" (OuterVolumeSpecName: "kube-api-access-np9mh") pod "b246afbf-fbd5-4f39-9c95-08aab014dda0" (UID: "b246afbf-fbd5-4f39-9c95-08aab014dda0"). InnerVolumeSpecName "kube-api-access-np9mh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.771680 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a-client-ca\") pod \"route-controller-manager-6cb978c9f5-7497s\" (UID: \"bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a\") " pod="openshift-route-controller-manager/route-controller-manager-6cb978c9f5-7497s" Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.771761 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r7xh\" (UniqueName: \"kubernetes.io/projected/bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a-kube-api-access-6r7xh\") pod \"route-controller-manager-6cb978c9f5-7497s\" (UID: \"bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a\") " pod="openshift-route-controller-manager/route-controller-manager-6cb978c9f5-7497s" Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.771852 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a-serving-cert\") pod \"route-controller-manager-6cb978c9f5-7497s\" (UID: \"bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a\") " pod="openshift-route-controller-manager/route-controller-manager-6cb978c9f5-7497s" Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.771872 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a-config\") pod \"route-controller-manager-6cb978c9f5-7497s\" (UID: \"bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a\") " pod="openshift-route-controller-manager/route-controller-manager-6cb978c9f5-7497s" Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.771903 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np9mh\" (UniqueName: \"kubernetes.io/projected/b246afbf-fbd5-4f39-9c95-08aab014dda0-kube-api-access-np9mh\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.771914 4685 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b246afbf-fbd5-4f39-9c95-08aab014dda0-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.813129 4685 patch_prober.go:28] interesting pod/controller-manager-cf75f487f-fj8jm container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.54:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.813209 4685 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-cf75f487f-fj8jm" podUID="98f91583-6a92-4490-96d0-a15aa61f30bb" 
containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.54:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.873134 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a-serving-cert\") pod \"route-controller-manager-6cb978c9f5-7497s\" (UID: \"bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a\") " pod="openshift-route-controller-manager/route-controller-manager-6cb978c9f5-7497s" Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.873198 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a-config\") pod \"route-controller-manager-6cb978c9f5-7497s\" (UID: \"bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a\") " pod="openshift-route-controller-manager/route-controller-manager-6cb978c9f5-7497s" Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.873247 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a-client-ca\") pod \"route-controller-manager-6cb978c9f5-7497s\" (UID: \"bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a\") " pod="openshift-route-controller-manager/route-controller-manager-6cb978c9f5-7497s" Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.873479 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r7xh\" (UniqueName: \"kubernetes.io/projected/bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a-kube-api-access-6r7xh\") pod \"route-controller-manager-6cb978c9f5-7497s\" (UID: \"bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a\") " pod="openshift-route-controller-manager/route-controller-manager-6cb978c9f5-7497s" Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.875975 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a-config\") pod \"route-controller-manager-6cb978c9f5-7497s\" (UID: \"bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a\") " pod="openshift-route-controller-manager/route-controller-manager-6cb978c9f5-7497s" Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.876084 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a-client-ca\") pod \"route-controller-manager-6cb978c9f5-7497s\" (UID: \"bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a\") " pod="openshift-route-controller-manager/route-controller-manager-6cb978c9f5-7497s" Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.877946 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a-serving-cert\") pod \"route-controller-manager-6cb978c9f5-7497s\" (UID: \"bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a\") " pod="openshift-route-controller-manager/route-controller-manager-6cb978c9f5-7497s" Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.896717 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r7xh\" (UniqueName: \"kubernetes.io/projected/bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a-kube-api-access-6r7xh\") pod \"route-controller-manager-6cb978c9f5-7497s\" (UID: 
\"bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a\") " pod="openshift-route-controller-manager/route-controller-manager-6cb978c9f5-7497s" Mar 21 03:50:27 crc kubenswrapper[4685]: I0321 03:50:27.917622 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6cb978c9f5-7497s" Mar 21 03:50:28 crc kubenswrapper[4685]: I0321 03:50:28.355005 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c5b579c85-zbdrh" event={"ID":"b246afbf-fbd5-4f39-9c95-08aab014dda0","Type":"ContainerDied","Data":"fa68fee35f1ebb62703e132435761b3d42b001d026111b18a30f0e2594ad0e05"} Mar 21 03:50:28 crc kubenswrapper[4685]: I0321 03:50:28.355338 4685 scope.go:117] "RemoveContainer" containerID="345685d8579eeb045c1610555ab08ac62f971fc6f14df13865bf2151b3d2739e" Mar 21 03:50:28 crc kubenswrapper[4685]: I0321 03:50:28.355283 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c5b579c85-zbdrh" Mar 21 03:50:28 crc kubenswrapper[4685]: I0321 03:50:28.359952 4685 generic.go:334] "Generic (PLEG): container finished" podID="4d6c412c-bc35-4360-91b0-06f8b60e7106" containerID="6d88af22f84c63a6294a89516a0484847f94f27b741e4b1dac713f35487b7d05" exitCode=0 Mar 21 03:50:28 crc kubenswrapper[4685]: I0321 03:50:28.359986 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lnbh8" event={"ID":"4d6c412c-bc35-4360-91b0-06f8b60e7106","Type":"ContainerDied","Data":"6d88af22f84c63a6294a89516a0484847f94f27b741e4b1dac713f35487b7d05"} Mar 21 03:50:28 crc kubenswrapper[4685]: I0321 03:50:28.361613 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8qq2k" event={"ID":"855b8c82-585a-4883-acdc-195377b480c2","Type":"ContainerStarted","Data":"9f6147bc58108bfff3269ed69ea4835e65ba044a497c4dc7dd1d1a1c4b9ea31e"} Mar 21 03:50:28 crc kubenswrapper[4685]: I0321 03:50:28.374932 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c5b579c85-zbdrh"] Mar 21 03:50:28 crc kubenswrapper[4685]: I0321 03:50:28.375205 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c5b579c85-zbdrh"] Mar 21 03:50:29 crc kubenswrapper[4685]: E0321 03:50:29.004609 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-mhc9s" podUID="931ed0e7-7ffb-48ba-92b0-28883a6f0b39" Mar 21 03:50:29 crc kubenswrapper[4685]: E0321 03:50:29.004702 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-7swf2" podUID="d4ebdf18-8426-42cc-93a6-60b46261aebe" Mar 21 03:50:29 crc kubenswrapper[4685]: E0321 03:50:29.060194 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-xvjs7" 
podUID="53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd" Mar 21 03:50:29 crc kubenswrapper[4685]: E0321 03:50:29.084300 4685 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 21 03:50:29 crc kubenswrapper[4685]: E0321 03:50:29.084463 4685 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-px57g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-s6khh_openshift-marketplace(9c1f4e4f-a993-423e-8922-d8b81967d483): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 21 03:50:29 crc kubenswrapper[4685]: E0321 03:50:29.085813 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-s6khh" podUID="9c1f4e4f-a993-423e-8922-d8b81967d483" Mar 21 03:50:29 crc kubenswrapper[4685]: E0321 03:50:29.095424 4685 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 21 03:50:29 crc kubenswrapper[4685]: E0321 03:50:29.095591 4685 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cz76z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-zrnqq_openshift-marketplace(f0e0a768-d2ec-4986-8cc7-72f0bd1d285a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 21 03:50:29 crc kubenswrapper[4685]: E0321 03:50:29.096748 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-zrnqq" podUID="f0e0a768-d2ec-4986-8cc7-72f0bd1d285a" Mar 21 03:50:29 crc kubenswrapper[4685]: I0321 03:50:29.104920 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-cf75f487f-fj8jm" Mar 21 03:50:29 crc kubenswrapper[4685]: I0321 03:50:29.191998 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98f91583-6a92-4490-96d0-a15aa61f30bb-client-ca\") pod \"98f91583-6a92-4490-96d0-a15aa61f30bb\" (UID: \"98f91583-6a92-4490-96d0-a15aa61f30bb\") " Mar 21 03:50:29 crc kubenswrapper[4685]: I0321 03:50:29.192424 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmlbc\" (UniqueName: \"kubernetes.io/projected/98f91583-6a92-4490-96d0-a15aa61f30bb-kube-api-access-zmlbc\") pod \"98f91583-6a92-4490-96d0-a15aa61f30bb\" (UID: \"98f91583-6a92-4490-96d0-a15aa61f30bb\") " Mar 21 03:50:29 crc kubenswrapper[4685]: I0321 03:50:29.192470 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98f91583-6a92-4490-96d0-a15aa61f30bb-serving-cert\") pod \"98f91583-6a92-4490-96d0-a15aa61f30bb\" (UID: \"98f91583-6a92-4490-96d0-a15aa61f30bb\") " Mar 21 03:50:29 crc kubenswrapper[4685]: I0321 03:50:29.192516 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98f91583-6a92-4490-96d0-a15aa61f30bb-config\") pod \"98f91583-6a92-4490-96d0-a15aa61f30bb\" (UID: \"98f91583-6a92-4490-96d0-a15aa61f30bb\") " Mar 21 03:50:29 crc kubenswrapper[4685]: I0321 03:50:29.192552 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/98f91583-6a92-4490-96d0-a15aa61f30bb-proxy-ca-bundles\") pod \"98f91583-6a92-4490-96d0-a15aa61f30bb\" (UID: \"98f91583-6a92-4490-96d0-a15aa61f30bb\") " Mar 21 03:50:29 crc kubenswrapper[4685]: I0321 03:50:29.192794 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98f91583-6a92-4490-96d0-a15aa61f30bb-client-ca" (OuterVolumeSpecName: "client-ca") pod "98f91583-6a92-4490-96d0-a15aa61f30bb" (UID: "98f91583-6a92-4490-96d0-a15aa61f30bb"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:50:29 crc kubenswrapper[4685]: I0321 03:50:29.193353 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98f91583-6a92-4490-96d0-a15aa61f30bb-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "98f91583-6a92-4490-96d0-a15aa61f30bb" (UID: "98f91583-6a92-4490-96d0-a15aa61f30bb"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:50:29 crc kubenswrapper[4685]: I0321 03:50:29.194261 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98f91583-6a92-4490-96d0-a15aa61f30bb-config" (OuterVolumeSpecName: "config") pod "98f91583-6a92-4490-96d0-a15aa61f30bb" (UID: "98f91583-6a92-4490-96d0-a15aa61f30bb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:50:29 crc kubenswrapper[4685]: I0321 03:50:29.197873 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98f91583-6a92-4490-96d0-a15aa61f30bb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "98f91583-6a92-4490-96d0-a15aa61f30bb" (UID: "98f91583-6a92-4490-96d0-a15aa61f30bb"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:50:29 crc kubenswrapper[4685]: I0321 03:50:29.203757 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98f91583-6a92-4490-96d0-a15aa61f30bb-kube-api-access-zmlbc" (OuterVolumeSpecName: "kube-api-access-zmlbc") pod "98f91583-6a92-4490-96d0-a15aa61f30bb" (UID: "98f91583-6a92-4490-96d0-a15aa61f30bb"). InnerVolumeSpecName "kube-api-access-zmlbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:50:29 crc kubenswrapper[4685]: I0321 03:50:29.296884 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmlbc\" (UniqueName: \"kubernetes.io/projected/98f91583-6a92-4490-96d0-a15aa61f30bb-kube-api-access-zmlbc\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:29 crc kubenswrapper[4685]: I0321 03:50:29.297237 4685 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98f91583-6a92-4490-96d0-a15aa61f30bb-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:29 crc kubenswrapper[4685]: I0321 03:50:29.297251 4685 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98f91583-6a92-4490-96d0-a15aa61f30bb-config\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:29 crc kubenswrapper[4685]: I0321 03:50:29.297263 4685 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/98f91583-6a92-4490-96d0-a15aa61f30bb-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:29 crc kubenswrapper[4685]: I0321 03:50:29.297274 4685 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98f91583-6a92-4490-96d0-a15aa61f30bb-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:29 crc kubenswrapper[4685]: I0321 03:50:29.376923 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-cf75f487f-fj8jm" Mar 21 03:50:29 crc kubenswrapper[4685]: I0321 03:50:29.376905 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cf75f487f-fj8jm" event={"ID":"98f91583-6a92-4490-96d0-a15aa61f30bb","Type":"ContainerDied","Data":"26d9cba37f46a5cbcc45279b257379996cc49b7a40ccd4484d0b6bf5ca73b867"} Mar 21 03:50:29 crc kubenswrapper[4685]: I0321 03:50:29.377073 4685 scope.go:117] "RemoveContainer" containerID="21ba54647eee116126737f67116e01f945c7bbdda2c86646d7cbb15b2bffd206" Mar 21 03:50:29 crc kubenswrapper[4685]: I0321 03:50:29.380510 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dcgr" event={"ID":"f7b276b3-8b85-4cfe-a39c-73da270336e3","Type":"ContainerStarted","Data":"2266ae0d76cc2d2073b2a0a2ac500b2377c0783e8373ad6fe093726bd17ef0aa"} Mar 21 03:50:29 crc kubenswrapper[4685]: I0321 03:50:29.382683 4685 generic.go:334] "Generic (PLEG): container finished" podID="855b8c82-585a-4883-acdc-195377b480c2" containerID="821c089f9c6fa6ae58872ec41bd332232ad8efc8e3e1194d906407aa95ebc527" exitCode=0 Mar 21 03:50:29 crc kubenswrapper[4685]: I0321 03:50:29.383025 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8qq2k" event={"ID":"855b8c82-585a-4883-acdc-195377b480c2","Type":"ContainerDied","Data":"821c089f9c6fa6ae58872ec41bd332232ad8efc8e3e1194d906407aa95ebc527"} Mar 21 03:50:29 crc kubenswrapper[4685]: E0321 03:50:29.384880 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-s6khh" podUID="9c1f4e4f-a993-423e-8922-d8b81967d483" Mar 21 03:50:29 crc kubenswrapper[4685]: E0321 03:50:29.385114 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-zrnqq" podUID="f0e0a768-d2ec-4986-8cc7-72f0bd1d285a" Mar 21 03:50:29 crc kubenswrapper[4685]: I0321 03:50:29.489297 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-cf75f487f-fj8jm"] Mar 21 03:50:29 crc kubenswrapper[4685]: I0321 03:50:29.495982 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-cf75f487f-fj8jm"] Mar 21 03:50:29 crc kubenswrapper[4685]: I0321 03:50:29.554238 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cb978c9f5-7497s"] Mar 21 03:50:29 crc kubenswrapper[4685]: W0321 03:50:29.610148 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcc97a3b_b2e8_4abe_885f_d4f79ce9a00a.slice/crio-0cee1203e73e48a68a410edf882122c22fb0a2cfbc70b6efffe9b2d5913080c2 WatchSource:0}: Error finding container 0cee1203e73e48a68a410edf882122c22fb0a2cfbc70b6efffe9b2d5913080c2: Status 404 returned error can't find the container with id 0cee1203e73e48a68a410edf882122c22fb0a2cfbc70b6efffe9b2d5913080c2 Mar 21 03:50:29 crc kubenswrapper[4685]: I0321 03:50:29.867903 4685 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 21 03:50:29 crc kubenswrapper[4685]: E0321 03:50:29.868382 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98f91583-6a92-4490-96d0-a15aa61f30bb" containerName="controller-manager" Mar 21 03:50:29 crc kubenswrapper[4685]: I0321 03:50:29.868405 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="98f91583-6a92-4490-96d0-a15aa61f30bb" containerName="controller-manager" Mar 21 03:50:29 crc kubenswrapper[4685]: I0321 03:50:29.868502 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="98f91583-6a92-4490-96d0-a15aa61f30bb" containerName="controller-manager" Mar 21 03:50:29 crc kubenswrapper[4685]: I0321 03:50:29.868905 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 21 03:50:29 crc kubenswrapper[4685]: I0321 03:50:29.870870 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 21 03:50:29 crc kubenswrapper[4685]: I0321 03:50:29.871330 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 21 03:50:29 crc kubenswrapper[4685]: I0321 03:50:29.876611 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 21 03:50:29 crc kubenswrapper[4685]: I0321 03:50:29.910964 4685 csr.go:261] certificate signing request csr-ql2ft is approved, waiting to be issued Mar 21 03:50:29 crc kubenswrapper[4685]: I0321 03:50:29.918746 4685 csr.go:257] certificate signing request csr-ql2ft is issued Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.004468 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ea872c47-e267-4556-aede-151cf2e0fc17-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ea872c47-e267-4556-aede-151cf2e0fc17\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.004708 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ea872c47-e267-4556-aede-151cf2e0fc17-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ea872c47-e267-4556-aede-151cf2e0fc17\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 21 03:50:30 crc kubenswrapper[4685]: E0321 03:50:30.008063 4685 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a68d63c_113c_4421_9444_78d05d636874.slice/crio-15cadc160d591c0cd64e34f00ca0765d613d878a4c236a5abb768732fbb0c4de.scope\": RecentStats: unable to find data in memory cache]" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.105572 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ea872c47-e267-4556-aede-151cf2e0fc17-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ea872c47-e267-4556-aede-151cf2e0fc17\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.105655 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ea872c47-e267-4556-aede-151cf2e0fc17-kube-api-access\") pod 
\"revision-pruner-9-crc\" (UID: \"ea872c47-e267-4556-aede-151cf2e0fc17\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.106096 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ea872c47-e267-4556-aede-151cf2e0fc17-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ea872c47-e267-4556-aede-151cf2e0fc17\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.123742 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ea872c47-e267-4556-aede-151cf2e0fc17-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ea872c47-e267-4556-aede-151cf2e0fc17\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.183115 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.312027 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98f91583-6a92-4490-96d0-a15aa61f30bb" path="/var/lib/kubelet/pods/98f91583-6a92-4490-96d0-a15aa61f30bb/volumes" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.312848 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b246afbf-fbd5-4f39-9c95-08aab014dda0" path="/var/lib/kubelet/pods/b246afbf-fbd5-4f39-9c95-08aab014dda0/volumes" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.395459 4685 generic.go:334] "Generic (PLEG): container finished" podID="4ede3f08-f29b-4cb9-a96f-1c66239498f6" containerID="83f58c6672a9d219e7c6fe2461498dcaa634db2007f4484fb3e6c7781cfa6bf9" exitCode=0 Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.395565 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567748-zv7h8" event={"ID":"4ede3f08-f29b-4cb9-a96f-1c66239498f6","Type":"ContainerDied","Data":"83f58c6672a9d219e7c6fe2461498dcaa634db2007f4484fb3e6c7781cfa6bf9"} Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.398416 4685 generic.go:334] "Generic (PLEG): container finished" podID="f7b276b3-8b85-4cfe-a39c-73da270336e3" containerID="2266ae0d76cc2d2073b2a0a2ac500b2377c0783e8373ad6fe093726bd17ef0aa" exitCode=0 Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.398501 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dcgr" event={"ID":"f7b276b3-8b85-4cfe-a39c-73da270336e3","Type":"ContainerDied","Data":"2266ae0d76cc2d2073b2a0a2ac500b2377c0783e8373ad6fe093726bd17ef0aa"} Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.401947 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6cb978c9f5-7497s" event={"ID":"bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a","Type":"ContainerStarted","Data":"c4109a761f654c4cea80457a301a9250eeca2a8dc37ed5a953f9fa2bbae1da3c"} Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.401985 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6cb978c9f5-7497s" event={"ID":"bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a","Type":"ContainerStarted","Data":"0cee1203e73e48a68a410edf882122c22fb0a2cfbc70b6efffe9b2d5913080c2"} Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.402392 4685 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6cb978c9f5-7497s" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.405350 4685 generic.go:334] "Generic (PLEG): container finished" podID="2a68d63c-113c-4421-9444-78d05d636874" containerID="15cadc160d591c0cd64e34f00ca0765d613d878a4c236a5abb768732fbb0c4de" exitCode=0 Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.405403 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567750-hlpdj" event={"ID":"2a68d63c-113c-4421-9444-78d05d636874","Type":"ContainerDied","Data":"15cadc160d591c0cd64e34f00ca0765d613d878a4c236a5abb768732fbb0c4de"} Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.408083 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6cb978c9f5-7497s" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.434727 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6cb978c9f5-7497s" podStartSLOduration=18.434704571 podStartE2EDuration="18.434704571s" podCreationTimestamp="2026-03-21 03:50:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:50:30.429543944 +0000 UTC m=+262.906612746" watchObservedRunningTime="2026-03-21 03:50:30.434704571 +0000 UTC m=+262.911773363" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.500621 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-9ff898d84-mcxth"] Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.501441 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-9ff898d84-mcxth" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.504089 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.504142 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.508168 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.508186 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.509541 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.509768 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.511646 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-9ff898d84-mcxth"] Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.512720 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.616271 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvx99\" (UniqueName: \"kubernetes.io/projected/80c17df6-fec5-457a-a362-6b3f7e6e7bef-kube-api-access-wvx99\") pod \"controller-manager-9ff898d84-mcxth\" (UID: \"80c17df6-fec5-457a-a362-6b3f7e6e7bef\") " pod="openshift-controller-manager/controller-manager-9ff898d84-mcxth" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.616320 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80c17df6-fec5-457a-a362-6b3f7e6e7bef-config\") pod \"controller-manager-9ff898d84-mcxth\" (UID: \"80c17df6-fec5-457a-a362-6b3f7e6e7bef\") " pod="openshift-controller-manager/controller-manager-9ff898d84-mcxth" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.616344 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/80c17df6-fec5-457a-a362-6b3f7e6e7bef-proxy-ca-bundles\") pod \"controller-manager-9ff898d84-mcxth\" (UID: \"80c17df6-fec5-457a-a362-6b3f7e6e7bef\") " pod="openshift-controller-manager/controller-manager-9ff898d84-mcxth" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.616369 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80c17df6-fec5-457a-a362-6b3f7e6e7bef-serving-cert\") pod \"controller-manager-9ff898d84-mcxth\" (UID: \"80c17df6-fec5-457a-a362-6b3f7e6e7bef\") " pod="openshift-controller-manager/controller-manager-9ff898d84-mcxth" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.616436 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/80c17df6-fec5-457a-a362-6b3f7e6e7bef-client-ca\") pod \"controller-manager-9ff898d84-mcxth\" (UID: \"80c17df6-fec5-457a-a362-6b3f7e6e7bef\") " pod="openshift-controller-manager/controller-manager-9ff898d84-mcxth" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.629210 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 21 03:50:30 crc kubenswrapper[4685]: W0321 03:50:30.637754 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podea872c47_e267_4556_aede_151cf2e0fc17.slice/crio-0b4be67649316f01904cb294dcce6dff4f391171e651f4f070cf194e3a60e5af WatchSource:0}: Error finding container 0b4be67649316f01904cb294dcce6dff4f391171e651f4f070cf194e3a60e5af: Status 404 returned error can't find the container with id 0b4be67649316f01904cb294dcce6dff4f391171e651f4f070cf194e3a60e5af Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.717259 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/80c17df6-fec5-457a-a362-6b3f7e6e7bef-client-ca\") pod \"controller-manager-9ff898d84-mcxth\" (UID: \"80c17df6-fec5-457a-a362-6b3f7e6e7bef\") " pod="openshift-controller-manager/controller-manager-9ff898d84-mcxth" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.717356 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvx99\" (UniqueName: \"kubernetes.io/projected/80c17df6-fec5-457a-a362-6b3f7e6e7bef-kube-api-access-wvx99\") pod \"controller-manager-9ff898d84-mcxth\" (UID: \"80c17df6-fec5-457a-a362-6b3f7e6e7bef\") " pod="openshift-controller-manager/controller-manager-9ff898d84-mcxth" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.717402 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80c17df6-fec5-457a-a362-6b3f7e6e7bef-config\") pod \"controller-manager-9ff898d84-mcxth\" (UID: \"80c17df6-fec5-457a-a362-6b3f7e6e7bef\") " pod="openshift-controller-manager/controller-manager-9ff898d84-mcxth" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.717430 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/80c17df6-fec5-457a-a362-6b3f7e6e7bef-proxy-ca-bundles\") pod \"controller-manager-9ff898d84-mcxth\" (UID: \"80c17df6-fec5-457a-a362-6b3f7e6e7bef\") " pod="openshift-controller-manager/controller-manager-9ff898d84-mcxth" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.717467 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80c17df6-fec5-457a-a362-6b3f7e6e7bef-serving-cert\") pod \"controller-manager-9ff898d84-mcxth\" (UID: \"80c17df6-fec5-457a-a362-6b3f7e6e7bef\") " pod="openshift-controller-manager/controller-manager-9ff898d84-mcxth" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.718191 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/80c17df6-fec5-457a-a362-6b3f7e6e7bef-client-ca\") pod \"controller-manager-9ff898d84-mcxth\" (UID: \"80c17df6-fec5-457a-a362-6b3f7e6e7bef\") " pod="openshift-controller-manager/controller-manager-9ff898d84-mcxth" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.719548 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/80c17df6-fec5-457a-a362-6b3f7e6e7bef-config\") pod \"controller-manager-9ff898d84-mcxth\" (UID: \"80c17df6-fec5-457a-a362-6b3f7e6e7bef\") " pod="openshift-controller-manager/controller-manager-9ff898d84-mcxth" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.719613 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/80c17df6-fec5-457a-a362-6b3f7e6e7bef-proxy-ca-bundles\") pod \"controller-manager-9ff898d84-mcxth\" (UID: \"80c17df6-fec5-457a-a362-6b3f7e6e7bef\") " pod="openshift-controller-manager/controller-manager-9ff898d84-mcxth" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.726700 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80c17df6-fec5-457a-a362-6b3f7e6e7bef-serving-cert\") pod \"controller-manager-9ff898d84-mcxth\" (UID: \"80c17df6-fec5-457a-a362-6b3f7e6e7bef\") " pod="openshift-controller-manager/controller-manager-9ff898d84-mcxth" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.733233 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvx99\" (UniqueName: \"kubernetes.io/projected/80c17df6-fec5-457a-a362-6b3f7e6e7bef-kube-api-access-wvx99\") pod \"controller-manager-9ff898d84-mcxth\" (UID: \"80c17df6-fec5-457a-a362-6b3f7e6e7bef\") " pod="openshift-controller-manager/controller-manager-9ff898d84-mcxth" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.825329 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9ff898d84-mcxth" Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.921458 4685 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-19 06:48:30.048020005 +0000 UTC Mar 21 03:50:30 crc kubenswrapper[4685]: I0321 03:50:30.921806 4685 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6554h57m59.126217151s for next certificate rotation Mar 21 03:50:31 crc kubenswrapper[4685]: I0321 03:50:31.223046 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-9ff898d84-mcxth"] Mar 21 03:50:31 crc kubenswrapper[4685]: W0321 03:50:31.230512 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80c17df6_fec5_457a_a362_6b3f7e6e7bef.slice/crio-72460cf0548d60bf457848480d6e07fcd161a3d6e59ec1bd3d6c4e7b896a6d5a WatchSource:0}: Error finding container 72460cf0548d60bf457848480d6e07fcd161a3d6e59ec1bd3d6c4e7b896a6d5a: Status 404 returned error can't find the container with id 72460cf0548d60bf457848480d6e07fcd161a3d6e59ec1bd3d6c4e7b896a6d5a Mar 21 03:50:31 crc kubenswrapper[4685]: I0321 03:50:31.416824 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dcgr" event={"ID":"f7b276b3-8b85-4cfe-a39c-73da270336e3","Type":"ContainerStarted","Data":"10cac42124e6af2d450cc0c4f55f51011571dfaac34b721984bfa9ad061b9436"} Mar 21 03:50:31 crc kubenswrapper[4685]: I0321 03:50:31.420440 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ea872c47-e267-4556-aede-151cf2e0fc17","Type":"ContainerStarted","Data":"9023e08a68fc92d0eeddbd4f2dc619f82b1e0034b29e29b6c82c7e48a4c4e3b3"} Mar 21 03:50:31 crc kubenswrapper[4685]: 
I0321 03:50:31.420492 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ea872c47-e267-4556-aede-151cf2e0fc17","Type":"ContainerStarted","Data":"0b4be67649316f01904cb294dcce6dff4f391171e651f4f070cf194e3a60e5af"} Mar 21 03:50:31 crc kubenswrapper[4685]: I0321 03:50:31.423353 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9ff898d84-mcxth" event={"ID":"80c17df6-fec5-457a-a362-6b3f7e6e7bef","Type":"ContainerStarted","Data":"bc6fdd2e4dd3ccc0ae5f24da0413313612131fcc23534db6537a83fd14bb4700"} Mar 21 03:50:31 crc kubenswrapper[4685]: I0321 03:50:31.423420 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9ff898d84-mcxth" event={"ID":"80c17df6-fec5-457a-a362-6b3f7e6e7bef","Type":"ContainerStarted","Data":"72460cf0548d60bf457848480d6e07fcd161a3d6e59ec1bd3d6c4e7b896a6d5a"} Mar 21 03:50:31 crc kubenswrapper[4685]: I0321 03:50:31.459910 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-9ff898d84-mcxth" podStartSLOduration=19.459888734 podStartE2EDuration="19.459888734s" podCreationTimestamp="2026-03-21 03:50:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:50:31.459473521 +0000 UTC m=+263.936542313" watchObservedRunningTime="2026-03-21 03:50:31.459888734 +0000 UTC m=+263.936957536" Mar 21 03:50:31 crc kubenswrapper[4685]: I0321 03:50:31.461723 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2dcgr" podStartSLOduration=3.016180849 podStartE2EDuration="37.461715579s" podCreationTimestamp="2026-03-21 03:49:54 +0000 UTC" firstStartedPulling="2026-03-21 03:49:56.661410532 +0000 UTC m=+229.138479324" lastFinishedPulling="2026-03-21 03:50:31.106945272 +0000 UTC m=+263.584014054" observedRunningTime="2026-03-21 03:50:31.437684 +0000 UTC m=+263.914752792" watchObservedRunningTime="2026-03-21 03:50:31.461715579 +0000 UTC m=+263.938784371" Mar 21 03:50:31 crc kubenswrapper[4685]: I0321 03:50:31.479193 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.479176469 podStartE2EDuration="2.479176469s" podCreationTimestamp="2026-03-21 03:50:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:50:31.477275352 +0000 UTC m=+263.954344144" watchObservedRunningTime="2026-03-21 03:50:31.479176469 +0000 UTC m=+263.956245261" Mar 21 03:50:31 crc kubenswrapper[4685]: I0321 03:50:31.792240 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567748-zv7h8" Mar 21 03:50:31 crc kubenswrapper[4685]: I0321 03:50:31.806056 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567750-hlpdj" Mar 21 03:50:31 crc kubenswrapper[4685]: I0321 03:50:31.840211 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgshs\" (UniqueName: \"kubernetes.io/projected/4ede3f08-f29b-4cb9-a96f-1c66239498f6-kube-api-access-kgshs\") pod \"4ede3f08-f29b-4cb9-a96f-1c66239498f6\" (UID: \"4ede3f08-f29b-4cb9-a96f-1c66239498f6\") " Mar 21 03:50:31 crc kubenswrapper[4685]: I0321 03:50:31.840270 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p672f\" (UniqueName: \"kubernetes.io/projected/2a68d63c-113c-4421-9444-78d05d636874-kube-api-access-p672f\") pod \"2a68d63c-113c-4421-9444-78d05d636874\" (UID: \"2a68d63c-113c-4421-9444-78d05d636874\") " Mar 21 03:50:31 crc kubenswrapper[4685]: I0321 03:50:31.846119 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ede3f08-f29b-4cb9-a96f-1c66239498f6-kube-api-access-kgshs" (OuterVolumeSpecName: "kube-api-access-kgshs") pod "4ede3f08-f29b-4cb9-a96f-1c66239498f6" (UID: "4ede3f08-f29b-4cb9-a96f-1c66239498f6"). InnerVolumeSpecName "kube-api-access-kgshs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:50:31 crc kubenswrapper[4685]: I0321 03:50:31.846567 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a68d63c-113c-4421-9444-78d05d636874-kube-api-access-p672f" (OuterVolumeSpecName: "kube-api-access-p672f") pod "2a68d63c-113c-4421-9444-78d05d636874" (UID: "2a68d63c-113c-4421-9444-78d05d636874"). InnerVolumeSpecName "kube-api-access-p672f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:50:31 crc kubenswrapper[4685]: I0321 03:50:31.922208 4685 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-02 00:30:21.558258359 +0000 UTC Mar 21 03:50:31 crc kubenswrapper[4685]: I0321 03:50:31.922934 4685 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6884h39m49.635331283s for next certificate rotation Mar 21 03:50:31 crc kubenswrapper[4685]: I0321 03:50:31.941223 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgshs\" (UniqueName: \"kubernetes.io/projected/4ede3f08-f29b-4cb9-a96f-1c66239498f6-kube-api-access-kgshs\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:31 crc kubenswrapper[4685]: I0321 03:50:31.941255 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p672f\" (UniqueName: \"kubernetes.io/projected/2a68d63c-113c-4421-9444-78d05d636874-kube-api-access-p672f\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:32 crc kubenswrapper[4685]: I0321 03:50:32.430167 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567748-zv7h8" event={"ID":"4ede3f08-f29b-4cb9-a96f-1c66239498f6","Type":"ContainerDied","Data":"2733a1272a0f7b10b0af660e7184383bdad8b10a00fc70bf8280bbe070f5e258"} Mar 21 03:50:32 crc kubenswrapper[4685]: I0321 03:50:32.430210 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2733a1272a0f7b10b0af660e7184383bdad8b10a00fc70bf8280bbe070f5e258" Mar 21 03:50:32 crc kubenswrapper[4685]: I0321 03:50:32.430272 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567748-zv7h8" Mar 21 03:50:32 crc kubenswrapper[4685]: I0321 03:50:32.433522 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567750-hlpdj" event={"ID":"2a68d63c-113c-4421-9444-78d05d636874","Type":"ContainerDied","Data":"e321bb85f49437c165b0c5a6b71d42d182d8a68f6c24afd740ccab510d39c3ce"} Mar 21 03:50:32 crc kubenswrapper[4685]: I0321 03:50:32.433560 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e321bb85f49437c165b0c5a6b71d42d182d8a68f6c24afd740ccab510d39c3ce" Mar 21 03:50:32 crc kubenswrapper[4685]: I0321 03:50:32.433530 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567750-hlpdj" Mar 21 03:50:32 crc kubenswrapper[4685]: I0321 03:50:32.435582 4685 generic.go:334] "Generic (PLEG): container finished" podID="ea872c47-e267-4556-aede-151cf2e0fc17" containerID="9023e08a68fc92d0eeddbd4f2dc619f82b1e0034b29e29b6c82c7e48a4c4e3b3" exitCode=0 Mar 21 03:50:32 crc kubenswrapper[4685]: I0321 03:50:32.436473 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ea872c47-e267-4556-aede-151cf2e0fc17","Type":"ContainerDied","Data":"9023e08a68fc92d0eeddbd4f2dc619f82b1e0034b29e29b6c82c7e48a4c4e3b3"} Mar 21 03:50:32 crc kubenswrapper[4685]: I0321 03:50:32.436554 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-9ff898d84-mcxth" Mar 21 03:50:32 crc kubenswrapper[4685]: I0321 03:50:32.444094 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-9ff898d84-mcxth" Mar 21 03:50:32 crc kubenswrapper[4685]: I0321 03:50:32.590204 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-9ff898d84-mcxth"] Mar 21 03:50:32 crc kubenswrapper[4685]: I0321 03:50:32.686263 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cb978c9f5-7497s"] Mar 21 03:50:33 crc kubenswrapper[4685]: I0321 03:50:33.442672 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6cb978c9f5-7497s" podUID="bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a" containerName="route-controller-manager" containerID="cri-o://c4109a761f654c4cea80457a301a9250eeca2a8dc37ed5a953f9fa2bbae1da3c" gracePeriod=30 Mar 21 03:50:33 crc kubenswrapper[4685]: I0321 03:50:33.740005 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 21 03:50:33 crc kubenswrapper[4685]: I0321 03:50:33.774367 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ea872c47-e267-4556-aede-151cf2e0fc17-kube-api-access\") pod \"ea872c47-e267-4556-aede-151cf2e0fc17\" (UID: \"ea872c47-e267-4556-aede-151cf2e0fc17\") " Mar 21 03:50:33 crc kubenswrapper[4685]: I0321 03:50:33.774491 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ea872c47-e267-4556-aede-151cf2e0fc17-kubelet-dir\") pod \"ea872c47-e267-4556-aede-151cf2e0fc17\" (UID: \"ea872c47-e267-4556-aede-151cf2e0fc17\") " Mar 21 03:50:33 crc kubenswrapper[4685]: I0321 03:50:33.774612 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea872c47-e267-4556-aede-151cf2e0fc17-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ea872c47-e267-4556-aede-151cf2e0fc17" (UID: "ea872c47-e267-4556-aede-151cf2e0fc17"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 03:50:33 crc kubenswrapper[4685]: I0321 03:50:33.774872 4685 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ea872c47-e267-4556-aede-151cf2e0fc17-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:33 crc kubenswrapper[4685]: I0321 03:50:33.782353 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea872c47-e267-4556-aede-151cf2e0fc17-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ea872c47-e267-4556-aede-151cf2e0fc17" (UID: "ea872c47-e267-4556-aede-151cf2e0fc17"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:50:33 crc kubenswrapper[4685]: I0321 03:50:33.875579 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ea872c47-e267-4556-aede-151cf2e0fc17-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.448695 4685 generic.go:334] "Generic (PLEG): container finished" podID="bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a" containerID="c4109a761f654c4cea80457a301a9250eeca2a8dc37ed5a953f9fa2bbae1da3c" exitCode=0 Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.449054 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6cb978c9f5-7497s" event={"ID":"bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a","Type":"ContainerDied","Data":"c4109a761f654c4cea80457a301a9250eeca2a8dc37ed5a953f9fa2bbae1da3c"} Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.449087 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6cb978c9f5-7497s" event={"ID":"bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a","Type":"ContainerDied","Data":"0cee1203e73e48a68a410edf882122c22fb0a2cfbc70b6efffe9b2d5913080c2"} Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.449101 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cee1203e73e48a68a410edf882122c22fb0a2cfbc70b6efffe9b2d5913080c2" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.451906 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.451964 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ea872c47-e267-4556-aede-151cf2e0fc17","Type":"ContainerDied","Data":"0b4be67649316f01904cb294dcce6dff4f391171e651f4f070cf194e3a60e5af"} Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.451984 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b4be67649316f01904cb294dcce6dff4f391171e651f4f070cf194e3a60e5af" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.451996 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-9ff898d84-mcxth" podUID="80c17df6-fec5-457a-a362-6b3f7e6e7bef" containerName="controller-manager" containerID="cri-o://bc6fdd2e4dd3ccc0ae5f24da0413313612131fcc23534db6537a83fd14bb4700" gracePeriod=30 Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.462265 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6cb978c9f5-7497s" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.584344 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r7xh\" (UniqueName: \"kubernetes.io/projected/bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a-kube-api-access-6r7xh\") pod \"bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a\" (UID: \"bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a\") " Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.584544 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a-client-ca\") pod \"bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a\" (UID: \"bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a\") " Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.584656 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a-config\") pod \"bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a\" (UID: \"bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a\") " Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.584690 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a-serving-cert\") pod \"bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a\" (UID: \"bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a\") " Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.585559 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a-client-ca" (OuterVolumeSpecName: "client-ca") pod "bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a" (UID: "bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.585772 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a-config" (OuterVolumeSpecName: "config") pod "bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a" (UID: "bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.588876 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a-kube-api-access-6r7xh" (OuterVolumeSpecName: "kube-api-access-6r7xh") pod "bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a" (UID: "bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a"). InnerVolumeSpecName "kube-api-access-6r7xh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.590271 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a" (UID: "bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.682919 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 21 03:50:34 crc kubenswrapper[4685]: E0321 03:50:34.683244 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a68d63c-113c-4421-9444-78d05d636874" containerName="oc" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.683257 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a68d63c-113c-4421-9444-78d05d636874" containerName="oc" Mar 21 03:50:34 crc kubenswrapper[4685]: E0321 03:50:34.683265 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a" containerName="route-controller-manager" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.683274 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a" containerName="route-controller-manager" Mar 21 03:50:34 crc kubenswrapper[4685]: E0321 03:50:34.683285 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea872c47-e267-4556-aede-151cf2e0fc17" containerName="pruner" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.683293 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea872c47-e267-4556-aede-151cf2e0fc17" containerName="pruner" Mar 21 03:50:34 crc kubenswrapper[4685]: E0321 03:50:34.683308 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ede3f08-f29b-4cb9-a96f-1c66239498f6" containerName="oc" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.683316 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ede3f08-f29b-4cb9-a96f-1c66239498f6" containerName="oc" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.683426 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a" containerName="route-controller-manager" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.683439 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea872c47-e267-4556-aede-151cf2e0fc17" containerName="pruner" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.683450 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ede3f08-f29b-4cb9-a96f-1c66239498f6" containerName="oc" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.683459 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a68d63c-113c-4421-9444-78d05d636874" containerName="oc" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.683879 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.686461 4685 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.686494 4685 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a-config\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.686503 4685 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.686514 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r7xh\" (UniqueName: \"kubernetes.io/projected/bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a-kube-api-access-6r7xh\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.686594 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.686600 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.691151 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.691235 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2dcgr" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.691354 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2dcgr" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.787254 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7b92b45a-4c33-4463-96c1-0d4227d1d118-var-lock\") pod \"installer-9-crc\" (UID: \"7b92b45a-4c33-4463-96c1-0d4227d1d118\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.787301 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7b92b45a-4c33-4463-96c1-0d4227d1d118-kubelet-dir\") pod \"installer-9-crc\" (UID: \"7b92b45a-4c33-4463-96c1-0d4227d1d118\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.787329 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7b92b45a-4c33-4463-96c1-0d4227d1d118-kube-api-access\") pod \"installer-9-crc\" (UID: \"7b92b45a-4c33-4463-96c1-0d4227d1d118\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.839341 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2dcgr" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.888361 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7b92b45a-4c33-4463-96c1-0d4227d1d118-kube-api-access\") pod \"installer-9-crc\" (UID: \"7b92b45a-4c33-4463-96c1-0d4227d1d118\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.888482 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7b92b45a-4c33-4463-96c1-0d4227d1d118-var-lock\") pod \"installer-9-crc\" (UID: \"7b92b45a-4c33-4463-96c1-0d4227d1d118\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.888498 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7b92b45a-4c33-4463-96c1-0d4227d1d118-kubelet-dir\") pod \"installer-9-crc\" (UID: \"7b92b45a-4c33-4463-96c1-0d4227d1d118\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.888558 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7b92b45a-4c33-4463-96c1-0d4227d1d118-kubelet-dir\") pod \"installer-9-crc\" (UID: \"7b92b45a-4c33-4463-96c1-0d4227d1d118\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.888826 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7b92b45a-4c33-4463-96c1-0d4227d1d118-var-lock\") pod \"installer-9-crc\" (UID: \"7b92b45a-4c33-4463-96c1-0d4227d1d118\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.915577 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7b92b45a-4c33-4463-96c1-0d4227d1d118-kube-api-access\") pod \"installer-9-crc\" (UID: \"7b92b45a-4c33-4463-96c1-0d4227d1d118\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 21 03:50:34 crc kubenswrapper[4685]: I0321 03:50:34.927307 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9ff898d84-mcxth" Mar 21 03:50:35 crc kubenswrapper[4685]: I0321 03:50:35.010102 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 21 03:50:35 crc kubenswrapper[4685]: I0321 03:50:35.091817 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80c17df6-fec5-457a-a362-6b3f7e6e7bef-config\") pod \"80c17df6-fec5-457a-a362-6b3f7e6e7bef\" (UID: \"80c17df6-fec5-457a-a362-6b3f7e6e7bef\") " Mar 21 03:50:35 crc kubenswrapper[4685]: I0321 03:50:35.091867 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/80c17df6-fec5-457a-a362-6b3f7e6e7bef-client-ca\") pod \"80c17df6-fec5-457a-a362-6b3f7e6e7bef\" (UID: \"80c17df6-fec5-457a-a362-6b3f7e6e7bef\") " Mar 21 03:50:35 crc kubenswrapper[4685]: I0321 03:50:35.091947 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80c17df6-fec5-457a-a362-6b3f7e6e7bef-serving-cert\") pod \"80c17df6-fec5-457a-a362-6b3f7e6e7bef\" (UID: \"80c17df6-fec5-457a-a362-6b3f7e6e7bef\") " Mar 21 03:50:35 crc kubenswrapper[4685]: I0321 03:50:35.091971 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/80c17df6-fec5-457a-a362-6b3f7e6e7bef-proxy-ca-bundles\") pod \"80c17df6-fec5-457a-a362-6b3f7e6e7bef\" (UID: \"80c17df6-fec5-457a-a362-6b3f7e6e7bef\") " Mar 21 03:50:35 crc kubenswrapper[4685]: I0321 03:50:35.092016 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvx99\" (UniqueName: \"kubernetes.io/projected/80c17df6-fec5-457a-a362-6b3f7e6e7bef-kube-api-access-wvx99\") pod \"80c17df6-fec5-457a-a362-6b3f7e6e7bef\" (UID: \"80c17df6-fec5-457a-a362-6b3f7e6e7bef\") " Mar 21 03:50:35 crc kubenswrapper[4685]: I0321 03:50:35.092814 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80c17df6-fec5-457a-a362-6b3f7e6e7bef-config" (OuterVolumeSpecName: "config") pod "80c17df6-fec5-457a-a362-6b3f7e6e7bef" (UID: "80c17df6-fec5-457a-a362-6b3f7e6e7bef"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:50:35 crc kubenswrapper[4685]: I0321 03:50:35.093068 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80c17df6-fec5-457a-a362-6b3f7e6e7bef-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "80c17df6-fec5-457a-a362-6b3f7e6e7bef" (UID: "80c17df6-fec5-457a-a362-6b3f7e6e7bef"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:50:35 crc kubenswrapper[4685]: I0321 03:50:35.094124 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80c17df6-fec5-457a-a362-6b3f7e6e7bef-client-ca" (OuterVolumeSpecName: "client-ca") pod "80c17df6-fec5-457a-a362-6b3f7e6e7bef" (UID: "80c17df6-fec5-457a-a362-6b3f7e6e7bef"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:50:35 crc kubenswrapper[4685]: I0321 03:50:35.096578 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80c17df6-fec5-457a-a362-6b3f7e6e7bef-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "80c17df6-fec5-457a-a362-6b3f7e6e7bef" (UID: "80c17df6-fec5-457a-a362-6b3f7e6e7bef"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:50:35 crc kubenswrapper[4685]: I0321 03:50:35.096999 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80c17df6-fec5-457a-a362-6b3f7e6e7bef-kube-api-access-wvx99" (OuterVolumeSpecName: "kube-api-access-wvx99") pod "80c17df6-fec5-457a-a362-6b3f7e6e7bef" (UID: "80c17df6-fec5-457a-a362-6b3f7e6e7bef"). InnerVolumeSpecName "kube-api-access-wvx99". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:50:35 crc kubenswrapper[4685]: I0321 03:50:35.194481 4685 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80c17df6-fec5-457a-a362-6b3f7e6e7bef-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:35 crc kubenswrapper[4685]: I0321 03:50:35.194531 4685 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/80c17df6-fec5-457a-a362-6b3f7e6e7bef-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:35 crc kubenswrapper[4685]: I0321 03:50:35.194542 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvx99\" (UniqueName: \"kubernetes.io/projected/80c17df6-fec5-457a-a362-6b3f7e6e7bef-kube-api-access-wvx99\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:35 crc kubenswrapper[4685]: I0321 03:50:35.194552 4685 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80c17df6-fec5-457a-a362-6b3f7e6e7bef-config\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:35 crc kubenswrapper[4685]: I0321 03:50:35.194560 4685 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/80c17df6-fec5-457a-a362-6b3f7e6e7bef-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:35 crc kubenswrapper[4685]: I0321 03:50:35.386941 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 21 03:50:35 crc kubenswrapper[4685]: I0321 03:50:35.466476 4685 generic.go:334] "Generic (PLEG): container finished" podID="80c17df6-fec5-457a-a362-6b3f7e6e7bef" containerID="bc6fdd2e4dd3ccc0ae5f24da0413313612131fcc23534db6537a83fd14bb4700" exitCode=0 Mar 21 03:50:35 crc kubenswrapper[4685]: I0321 03:50:35.466531 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-9ff898d84-mcxth" Mar 21 03:50:35 crc kubenswrapper[4685]: I0321 03:50:35.466594 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9ff898d84-mcxth" event={"ID":"80c17df6-fec5-457a-a362-6b3f7e6e7bef","Type":"ContainerDied","Data":"bc6fdd2e4dd3ccc0ae5f24da0413313612131fcc23534db6537a83fd14bb4700"} Mar 21 03:50:35 crc kubenswrapper[4685]: I0321 03:50:35.466674 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9ff898d84-mcxth" event={"ID":"80c17df6-fec5-457a-a362-6b3f7e6e7bef","Type":"ContainerDied","Data":"72460cf0548d60bf457848480d6e07fcd161a3d6e59ec1bd3d6c4e7b896a6d5a"} Mar 21 03:50:35 crc kubenswrapper[4685]: I0321 03:50:35.466709 4685 scope.go:117] "RemoveContainer" containerID="bc6fdd2e4dd3ccc0ae5f24da0413313612131fcc23534db6537a83fd14bb4700" Mar 21 03:50:35 crc kubenswrapper[4685]: I0321 03:50:35.469430 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"7b92b45a-4c33-4463-96c1-0d4227d1d118","Type":"ContainerStarted","Data":"e11b6aaad494ccf25935123fbf6b04d5f96caf0dc48dcb7c61293c4b47e05eed"} Mar 21 03:50:35 crc kubenswrapper[4685]: I0321 03:50:35.469492 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6cb978c9f5-7497s" Mar 21 03:50:35 crc kubenswrapper[4685]: I0321 03:50:35.490087 4685 scope.go:117] "RemoveContainer" containerID="bc6fdd2e4dd3ccc0ae5f24da0413313612131fcc23534db6537a83fd14bb4700" Mar 21 03:50:35 crc kubenswrapper[4685]: E0321 03:50:35.490562 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc6fdd2e4dd3ccc0ae5f24da0413313612131fcc23534db6537a83fd14bb4700\": container with ID starting with bc6fdd2e4dd3ccc0ae5f24da0413313612131fcc23534db6537a83fd14bb4700 not found: ID does not exist" containerID="bc6fdd2e4dd3ccc0ae5f24da0413313612131fcc23534db6537a83fd14bb4700" Mar 21 03:50:35 crc kubenswrapper[4685]: I0321 03:50:35.490623 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc6fdd2e4dd3ccc0ae5f24da0413313612131fcc23534db6537a83fd14bb4700"} err="failed to get container status \"bc6fdd2e4dd3ccc0ae5f24da0413313612131fcc23534db6537a83fd14bb4700\": rpc error: code = NotFound desc = could not find container \"bc6fdd2e4dd3ccc0ae5f24da0413313612131fcc23534db6537a83fd14bb4700\": container with ID starting with bc6fdd2e4dd3ccc0ae5f24da0413313612131fcc23534db6537a83fd14bb4700 not found: ID does not exist" Mar 21 03:50:35 crc kubenswrapper[4685]: I0321 03:50:35.511439 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cb978c9f5-7497s"] Mar 21 03:50:35 crc kubenswrapper[4685]: I0321 03:50:35.520107 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cb978c9f5-7497s"] Mar 21 03:50:35 crc kubenswrapper[4685]: I0321 03:50:35.527012 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-9ff898d84-mcxth"] Mar 21 03:50:35 crc kubenswrapper[4685]: I0321 03:50:35.528946 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-9ff898d84-mcxth"] Mar 21 03:50:36 crc kubenswrapper[4685]: 
I0321 03:50:36.307817 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80c17df6-fec5-457a-a362-6b3f7e6e7bef" path="/var/lib/kubelet/pods/80c17df6-fec5-457a-a362-6b3f7e6e7bef/volumes" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.308801 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a" path="/var/lib/kubelet/pods/bcc97a3b-b2e8-4abe-885f-d4f79ce9a00a/volumes" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.476490 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"7b92b45a-4c33-4463-96c1-0d4227d1d118","Type":"ContainerStarted","Data":"8685673bd128e91f0736b69d2792898a866d876fcab17106ff676fa213f77a21"} Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.497887 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.497855759 podStartE2EDuration="2.497855759s" podCreationTimestamp="2026-03-21 03:50:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:50:36.496693113 +0000 UTC m=+268.973761925" watchObservedRunningTime="2026-03-21 03:50:36.497855759 +0000 UTC m=+268.974924551" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.522245 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-866b8556c4-tvtvg"] Mar 21 03:50:36 crc kubenswrapper[4685]: E0321 03:50:36.522527 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80c17df6-fec5-457a-a362-6b3f7e6e7bef" containerName="controller-manager" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.522543 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="80c17df6-fec5-457a-a362-6b3f7e6e7bef" containerName="controller-manager" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.522692 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="80c17df6-fec5-457a-a362-6b3f7e6e7bef" containerName="controller-manager" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.523250 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-866b8556c4-tvtvg" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.527125 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.527269 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.527309 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.527465 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.527676 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.527746 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.533244 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-856b8c6d67-q7b7q"] Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.534062 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-856b8c6d67-q7b7q" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.535404 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.536293 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.536319 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.537369 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.538753 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.538983 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.539664 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-866b8556c4-tvtvg"] Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.545126 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.554599 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2dcgr" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.556165 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-856b8c6d67-q7b7q"] Mar 21 03:50:36 
crc kubenswrapper[4685]: I0321 03:50:36.605186 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2dcgr"] Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.712158 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d5321225-ce62-44de-bb49-821ec1823946-proxy-ca-bundles\") pod \"controller-manager-866b8556c4-tvtvg\" (UID: \"d5321225-ce62-44de-bb49-821ec1823946\") " pod="openshift-controller-manager/controller-manager-866b8556c4-tvtvg" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.712211 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/991e3873-a3a7-4195-9323-606a14313a5a-serving-cert\") pod \"route-controller-manager-856b8c6d67-q7b7q\" (UID: \"991e3873-a3a7-4195-9323-606a14313a5a\") " pod="openshift-route-controller-manager/route-controller-manager-856b8c6d67-q7b7q" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.712300 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5321225-ce62-44de-bb49-821ec1823946-serving-cert\") pod \"controller-manager-866b8556c4-tvtvg\" (UID: \"d5321225-ce62-44de-bb49-821ec1823946\") " pod="openshift-controller-manager/controller-manager-866b8556c4-tvtvg" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.712329 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgvfx\" (UniqueName: \"kubernetes.io/projected/d5321225-ce62-44de-bb49-821ec1823946-kube-api-access-fgvfx\") pod \"controller-manager-866b8556c4-tvtvg\" (UID: \"d5321225-ce62-44de-bb49-821ec1823946\") " pod="openshift-controller-manager/controller-manager-866b8556c4-tvtvg" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.712361 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/991e3873-a3a7-4195-9323-606a14313a5a-config\") pod \"route-controller-manager-856b8c6d67-q7b7q\" (UID: \"991e3873-a3a7-4195-9323-606a14313a5a\") " pod="openshift-route-controller-manager/route-controller-manager-856b8c6d67-q7b7q" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.712400 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/991e3873-a3a7-4195-9323-606a14313a5a-client-ca\") pod \"route-controller-manager-856b8c6d67-q7b7q\" (UID: \"991e3873-a3a7-4195-9323-606a14313a5a\") " pod="openshift-route-controller-manager/route-controller-manager-856b8c6d67-q7b7q" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.712426 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg5nc\" (UniqueName: \"kubernetes.io/projected/991e3873-a3a7-4195-9323-606a14313a5a-kube-api-access-kg5nc\") pod \"route-controller-manager-856b8c6d67-q7b7q\" (UID: \"991e3873-a3a7-4195-9323-606a14313a5a\") " pod="openshift-route-controller-manager/route-controller-manager-856b8c6d67-q7b7q" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.712452 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/d5321225-ce62-44de-bb49-821ec1823946-client-ca\") pod \"controller-manager-866b8556c4-tvtvg\" (UID: \"d5321225-ce62-44de-bb49-821ec1823946\") " pod="openshift-controller-manager/controller-manager-866b8556c4-tvtvg" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.712477 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5321225-ce62-44de-bb49-821ec1823946-config\") pod \"controller-manager-866b8556c4-tvtvg\" (UID: \"d5321225-ce62-44de-bb49-821ec1823946\") " pod="openshift-controller-manager/controller-manager-866b8556c4-tvtvg" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.813810 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d5321225-ce62-44de-bb49-821ec1823946-proxy-ca-bundles\") pod \"controller-manager-866b8556c4-tvtvg\" (UID: \"d5321225-ce62-44de-bb49-821ec1823946\") " pod="openshift-controller-manager/controller-manager-866b8556c4-tvtvg" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.813875 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/991e3873-a3a7-4195-9323-606a14313a5a-serving-cert\") pod \"route-controller-manager-856b8c6d67-q7b7q\" (UID: \"991e3873-a3a7-4195-9323-606a14313a5a\") " pod="openshift-route-controller-manager/route-controller-manager-856b8c6d67-q7b7q" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.813912 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5321225-ce62-44de-bb49-821ec1823946-serving-cert\") pod \"controller-manager-866b8556c4-tvtvg\" (UID: \"d5321225-ce62-44de-bb49-821ec1823946\") " pod="openshift-controller-manager/controller-manager-866b8556c4-tvtvg" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.813931 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgvfx\" (UniqueName: \"kubernetes.io/projected/d5321225-ce62-44de-bb49-821ec1823946-kube-api-access-fgvfx\") pod \"controller-manager-866b8556c4-tvtvg\" (UID: \"d5321225-ce62-44de-bb49-821ec1823946\") " pod="openshift-controller-manager/controller-manager-866b8556c4-tvtvg" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.813956 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/991e3873-a3a7-4195-9323-606a14313a5a-config\") pod \"route-controller-manager-856b8c6d67-q7b7q\" (UID: \"991e3873-a3a7-4195-9323-606a14313a5a\") " pod="openshift-route-controller-manager/route-controller-manager-856b8c6d67-q7b7q" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.813983 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/991e3873-a3a7-4195-9323-606a14313a5a-client-ca\") pod \"route-controller-manager-856b8c6d67-q7b7q\" (UID: \"991e3873-a3a7-4195-9323-606a14313a5a\") " pod="openshift-route-controller-manager/route-controller-manager-856b8c6d67-q7b7q" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.814004 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg5nc\" (UniqueName: \"kubernetes.io/projected/991e3873-a3a7-4195-9323-606a14313a5a-kube-api-access-kg5nc\") pod \"route-controller-manager-856b8c6d67-q7b7q\" 
(UID: \"991e3873-a3a7-4195-9323-606a14313a5a\") " pod="openshift-route-controller-manager/route-controller-manager-856b8c6d67-q7b7q" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.814019 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d5321225-ce62-44de-bb49-821ec1823946-client-ca\") pod \"controller-manager-866b8556c4-tvtvg\" (UID: \"d5321225-ce62-44de-bb49-821ec1823946\") " pod="openshift-controller-manager/controller-manager-866b8556c4-tvtvg" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.814036 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5321225-ce62-44de-bb49-821ec1823946-config\") pod \"controller-manager-866b8556c4-tvtvg\" (UID: \"d5321225-ce62-44de-bb49-821ec1823946\") " pod="openshift-controller-manager/controller-manager-866b8556c4-tvtvg" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.815200 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5321225-ce62-44de-bb49-821ec1823946-config\") pod \"controller-manager-866b8556c4-tvtvg\" (UID: \"d5321225-ce62-44de-bb49-821ec1823946\") " pod="openshift-controller-manager/controller-manager-866b8556c4-tvtvg" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.816212 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d5321225-ce62-44de-bb49-821ec1823946-proxy-ca-bundles\") pod \"controller-manager-866b8556c4-tvtvg\" (UID: \"d5321225-ce62-44de-bb49-821ec1823946\") " pod="openshift-controller-manager/controller-manager-866b8556c4-tvtvg" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.817179 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/991e3873-a3a7-4195-9323-606a14313a5a-client-ca\") pod \"route-controller-manager-856b8c6d67-q7b7q\" (UID: \"991e3873-a3a7-4195-9323-606a14313a5a\") " pod="openshift-route-controller-manager/route-controller-manager-856b8c6d67-q7b7q" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.817443 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/991e3873-a3a7-4195-9323-606a14313a5a-config\") pod \"route-controller-manager-856b8c6d67-q7b7q\" (UID: \"991e3873-a3a7-4195-9323-606a14313a5a\") " pod="openshift-route-controller-manager/route-controller-manager-856b8c6d67-q7b7q" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.817659 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d5321225-ce62-44de-bb49-821ec1823946-client-ca\") pod \"controller-manager-866b8556c4-tvtvg\" (UID: \"d5321225-ce62-44de-bb49-821ec1823946\") " pod="openshift-controller-manager/controller-manager-866b8556c4-tvtvg" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.824572 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/991e3873-a3a7-4195-9323-606a14313a5a-serving-cert\") pod \"route-controller-manager-856b8c6d67-q7b7q\" (UID: \"991e3873-a3a7-4195-9323-606a14313a5a\") " pod="openshift-route-controller-manager/route-controller-manager-856b8c6d67-q7b7q" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.825831 4685 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5321225-ce62-44de-bb49-821ec1823946-serving-cert\") pod \"controller-manager-866b8556c4-tvtvg\" (UID: \"d5321225-ce62-44de-bb49-821ec1823946\") " pod="openshift-controller-manager/controller-manager-866b8556c4-tvtvg" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.832656 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg5nc\" (UniqueName: \"kubernetes.io/projected/991e3873-a3a7-4195-9323-606a14313a5a-kube-api-access-kg5nc\") pod \"route-controller-manager-856b8c6d67-q7b7q\" (UID: \"991e3873-a3a7-4195-9323-606a14313a5a\") " pod="openshift-route-controller-manager/route-controller-manager-856b8c6d67-q7b7q" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.834103 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgvfx\" (UniqueName: \"kubernetes.io/projected/d5321225-ce62-44de-bb49-821ec1823946-kube-api-access-fgvfx\") pod \"controller-manager-866b8556c4-tvtvg\" (UID: \"d5321225-ce62-44de-bb49-821ec1823946\") " pod="openshift-controller-manager/controller-manager-866b8556c4-tvtvg" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.870997 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-866b8556c4-tvtvg" Mar 21 03:50:36 crc kubenswrapper[4685]: I0321 03:50:36.880671 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-856b8c6d67-q7b7q" Mar 21 03:50:38 crc kubenswrapper[4685]: I0321 03:50:38.486134 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2dcgr" podUID="f7b276b3-8b85-4cfe-a39c-73da270336e3" containerName="registry-server" containerID="cri-o://10cac42124e6af2d450cc0c4f55f51011571dfaac34b721984bfa9ad061b9436" gracePeriod=2 Mar 21 03:50:39 crc kubenswrapper[4685]: I0321 03:50:39.494452 4685 generic.go:334] "Generic (PLEG): container finished" podID="f7b276b3-8b85-4cfe-a39c-73da270336e3" containerID="10cac42124e6af2d450cc0c4f55f51011571dfaac34b721984bfa9ad061b9436" exitCode=0 Mar 21 03:50:39 crc kubenswrapper[4685]: I0321 03:50:39.494803 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dcgr" event={"ID":"f7b276b3-8b85-4cfe-a39c-73da270336e3","Type":"ContainerDied","Data":"10cac42124e6af2d450cc0c4f55f51011571dfaac34b721984bfa9ad061b9436"} Mar 21 03:50:39 crc kubenswrapper[4685]: I0321 03:50:39.687211 4685 patch_prober.go:28] interesting pod/machine-config-daemon-7r9cg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 03:50:39 crc kubenswrapper[4685]: I0321 03:50:39.687291 4685 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 03:50:39 crc kubenswrapper[4685]: I0321 03:50:39.687354 4685 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" Mar 21 03:50:39 crc kubenswrapper[4685]: I0321 
03:50:39.688222 4685 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3"} pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 03:50:39 crc kubenswrapper[4685]: I0321 03:50:39.688308 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerName="machine-config-daemon" containerID="cri-o://682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3" gracePeriod=600 Mar 21 03:50:39 crc kubenswrapper[4685]: I0321 03:50:39.845282 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-866b8556c4-tvtvg"] Mar 21 03:50:39 crc kubenswrapper[4685]: I0321 03:50:39.912270 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-856b8c6d67-q7b7q"] Mar 21 03:50:39 crc kubenswrapper[4685]: W0321 03:50:39.937448 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod991e3873_a3a7_4195_9323_606a14313a5a.slice/crio-d8b74c61366b8e8880939f713344df6c68d93acc279a6eb4863c1678477491c8 WatchSource:0}: Error finding container d8b74c61366b8e8880939f713344df6c68d93acc279a6eb4863c1678477491c8: Status 404 returned error can't find the container with id d8b74c61366b8e8880939f713344df6c68d93acc279a6eb4863c1678477491c8 Mar 21 03:50:40 crc kubenswrapper[4685]: I0321 03:50:40.080700 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2dcgr" Mar 21 03:50:40 crc kubenswrapper[4685]: I0321 03:50:40.257134 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7b276b3-8b85-4cfe-a39c-73da270336e3-utilities\") pod \"f7b276b3-8b85-4cfe-a39c-73da270336e3\" (UID: \"f7b276b3-8b85-4cfe-a39c-73da270336e3\") " Mar 21 03:50:40 crc kubenswrapper[4685]: I0321 03:50:40.257599 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dncj\" (UniqueName: \"kubernetes.io/projected/f7b276b3-8b85-4cfe-a39c-73da270336e3-kube-api-access-7dncj\") pod \"f7b276b3-8b85-4cfe-a39c-73da270336e3\" (UID: \"f7b276b3-8b85-4cfe-a39c-73da270336e3\") " Mar 21 03:50:40 crc kubenswrapper[4685]: I0321 03:50:40.257623 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7b276b3-8b85-4cfe-a39c-73da270336e3-catalog-content\") pod \"f7b276b3-8b85-4cfe-a39c-73da270336e3\" (UID: \"f7b276b3-8b85-4cfe-a39c-73da270336e3\") " Mar 21 03:50:40 crc kubenswrapper[4685]: I0321 03:50:40.258341 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7b276b3-8b85-4cfe-a39c-73da270336e3-utilities" (OuterVolumeSpecName: "utilities") pod "f7b276b3-8b85-4cfe-a39c-73da270336e3" (UID: "f7b276b3-8b85-4cfe-a39c-73da270336e3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 03:50:40 crc kubenswrapper[4685]: I0321 03:50:40.264180 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7b276b3-8b85-4cfe-a39c-73da270336e3-kube-api-access-7dncj" (OuterVolumeSpecName: "kube-api-access-7dncj") pod "f7b276b3-8b85-4cfe-a39c-73da270336e3" (UID: "f7b276b3-8b85-4cfe-a39c-73da270336e3"). InnerVolumeSpecName "kube-api-access-7dncj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:50:40 crc kubenswrapper[4685]: I0321 03:50:40.316127 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7b276b3-8b85-4cfe-a39c-73da270336e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f7b276b3-8b85-4cfe-a39c-73da270336e3" (UID: "f7b276b3-8b85-4cfe-a39c-73da270336e3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 03:50:40 crc kubenswrapper[4685]: I0321 03:50:40.358750 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dncj\" (UniqueName: \"kubernetes.io/projected/f7b276b3-8b85-4cfe-a39c-73da270336e3-kube-api-access-7dncj\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:40 crc kubenswrapper[4685]: I0321 03:50:40.358783 4685 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7b276b3-8b85-4cfe-a39c-73da270336e3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:40 crc kubenswrapper[4685]: I0321 03:50:40.358794 4685 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7b276b3-8b85-4cfe-a39c-73da270336e3-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:40 crc kubenswrapper[4685]: I0321 03:50:40.503337 4685 generic.go:334] "Generic (PLEG): container finished" podID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerID="682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3" exitCode=0 Mar 21 03:50:40 crc kubenswrapper[4685]: I0321 03:50:40.503426 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" event={"ID":"cea46fe2-4e41-43ab-a069-cb30fb4e732c","Type":"ContainerDied","Data":"682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3"} Mar 21 03:50:40 crc kubenswrapper[4685]: I0321 03:50:40.503746 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" event={"ID":"cea46fe2-4e41-43ab-a069-cb30fb4e732c","Type":"ContainerStarted","Data":"8ae2ea0f2d37402c06f62c37b02e2377743ae0dc80e5e3ec752094ab6ef40392"} Mar 21 03:50:40 crc kubenswrapper[4685]: I0321 03:50:40.506658 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-856b8c6d67-q7b7q" event={"ID":"991e3873-a3a7-4195-9323-606a14313a5a","Type":"ContainerStarted","Data":"4795f906f8314a5957ff1c3e1c0fa30773d974eca6cf3eaac417cb7b2d13d120"} Mar 21 03:50:40 crc kubenswrapper[4685]: I0321 03:50:40.506698 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-856b8c6d67-q7b7q" event={"ID":"991e3873-a3a7-4195-9323-606a14313a5a","Type":"ContainerStarted","Data":"d8b74c61366b8e8880939f713344df6c68d93acc279a6eb4863c1678477491c8"} Mar 21 03:50:40 crc kubenswrapper[4685]: I0321 03:50:40.506852 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-856b8c6d67-q7b7q" Mar 21 03:50:40 crc kubenswrapper[4685]: I0321 03:50:40.509451 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lnbh8" event={"ID":"4d6c412c-bc35-4360-91b0-06f8b60e7106","Type":"ContainerStarted","Data":"f7816f09862dfe2024ef340f42986a62c4e444dc50da83fe6d3101a439d9efca"} Mar 21 03:50:40 crc kubenswrapper[4685]: I0321 03:50:40.510857 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-866b8556c4-tvtvg" event={"ID":"d5321225-ce62-44de-bb49-821ec1823946","Type":"ContainerStarted","Data":"5ca9a11f9078ce21f842573c019b6109d22d8ea2b1d0f79dbf0587ae599bb87c"} Mar 21 03:50:40 crc kubenswrapper[4685]: I0321 03:50:40.510889 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-866b8556c4-tvtvg" event={"ID":"d5321225-ce62-44de-bb49-821ec1823946","Type":"ContainerStarted","Data":"524c2e261280c095be9ac5d50861c9e472d482d058241042672e9fdf696e7359"} Mar 21 03:50:40 crc kubenswrapper[4685]: I0321 03:50:40.511263 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-866b8556c4-tvtvg" Mar 21 03:50:40 crc kubenswrapper[4685]: I0321 03:50:40.518180 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dcgr" event={"ID":"f7b276b3-8b85-4cfe-a39c-73da270336e3","Type":"ContainerDied","Data":"764e4d787c3552c5a132db4bb0c6be5e3a7f4c1ae5ac6d13e8c12b6397479ad1"} Mar 21 03:50:40 crc kubenswrapper[4685]: I0321 03:50:40.518223 4685 scope.go:117] "RemoveContainer" containerID="10cac42124e6af2d450cc0c4f55f51011571dfaac34b721984bfa9ad061b9436" Mar 21 03:50:40 crc kubenswrapper[4685]: I0321 03:50:40.518338 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2dcgr" Mar 21 03:50:40 crc kubenswrapper[4685]: I0321 03:50:40.531812 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-866b8556c4-tvtvg" Mar 21 03:50:40 crc kubenswrapper[4685]: I0321 03:50:40.533785 4685 scope.go:117] "RemoveContainer" containerID="2266ae0d76cc2d2073b2a0a2ac500b2377c0783e8373ad6fe093726bd17ef0aa" Mar 21 03:50:40 crc kubenswrapper[4685]: I0321 03:50:40.547226 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2dcgr"] Mar 21 03:50:40 crc kubenswrapper[4685]: I0321 03:50:40.549515 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2dcgr"] Mar 21 03:50:40 crc kubenswrapper[4685]: I0321 03:50:40.564989 4685 scope.go:117] "RemoveContainer" containerID="ec70bcfacbcb092a27deb4937e29edf6b929d2ce6427e3f3d4290177836a63d5" Mar 21 03:50:40 crc kubenswrapper[4685]: I0321 03:50:40.570089 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-856b8c6d67-q7b7q" podStartSLOduration=8.570069955 podStartE2EDuration="8.570069955s" podCreationTimestamp="2026-03-21 03:50:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:50:40.563081333 +0000 UTC m=+273.040150135" watchObservedRunningTime="2026-03-21 03:50:40.570069955 +0000 UTC m=+273.047138747" Mar 21 03:50:40 crc kubenswrapper[4685]: I0321 03:50:40.604439 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-866b8556c4-tvtvg" podStartSLOduration=8.604362205 podStartE2EDuration="8.604362205s" podCreationTimestamp="2026-03-21 03:50:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:50:40.593435224 +0000 UTC m=+273.070504016" watchObservedRunningTime="2026-03-21 03:50:40.604362205 +0000 UTC m=+273.081430997" Mar 21 03:50:41 crc kubenswrapper[4685]: I0321 03:50:41.118500 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-856b8c6d67-q7b7q" Mar 21 03:50:41 crc kubenswrapper[4685]: I0321 03:50:41.524758 4685 generic.go:334] "Generic (PLEG): container finished" podID="d4ebdf18-8426-42cc-93a6-60b46261aebe" containerID="d478a553216614720b07cd59fe60242ee94a243f67d0ea92b29ac48a5c59b7e0" exitCode=0 Mar 21 03:50:41 crc kubenswrapper[4685]: I0321 03:50:41.524885 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7swf2" event={"ID":"d4ebdf18-8426-42cc-93a6-60b46261aebe","Type":"ContainerDied","Data":"d478a553216614720b07cd59fe60242ee94a243f67d0ea92b29ac48a5c59b7e0"} Mar 21 03:50:41 crc kubenswrapper[4685]: I0321 03:50:41.531877 4685 generic.go:334] "Generic (PLEG): container finished" podID="4d6c412c-bc35-4360-91b0-06f8b60e7106" containerID="f7816f09862dfe2024ef340f42986a62c4e444dc50da83fe6d3101a439d9efca" exitCode=0 Mar 21 03:50:41 crc kubenswrapper[4685]: I0321 03:50:41.531965 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lnbh8" 
event={"ID":"4d6c412c-bc35-4360-91b0-06f8b60e7106","Type":"ContainerDied","Data":"f7816f09862dfe2024ef340f42986a62c4e444dc50da83fe6d3101a439d9efca"} Mar 21 03:50:41 crc kubenswrapper[4685]: I0321 03:50:41.535859 4685 generic.go:334] "Generic (PLEG): container finished" podID="855b8c82-585a-4883-acdc-195377b480c2" containerID="ff6ba428faf1dd45dd1cca847ad4617672ff88cd427a323bebca1b9be6a3f2d9" exitCode=0 Mar 21 03:50:41 crc kubenswrapper[4685]: I0321 03:50:41.535904 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8qq2k" event={"ID":"855b8c82-585a-4883-acdc-195377b480c2","Type":"ContainerDied","Data":"ff6ba428faf1dd45dd1cca847ad4617672ff88cd427a323bebca1b9be6a3f2d9"} Mar 21 03:50:42 crc kubenswrapper[4685]: I0321 03:50:42.310465 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7b276b3-8b85-4cfe-a39c-73da270336e3" path="/var/lib/kubelet/pods/f7b276b3-8b85-4cfe-a39c-73da270336e3/volumes" Mar 21 03:50:42 crc kubenswrapper[4685]: I0321 03:50:42.543626 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8qq2k" event={"ID":"855b8c82-585a-4883-acdc-195377b480c2","Type":"ContainerStarted","Data":"f9ee7e5feba13eccb34afa5067ffbeb4ff13d78d67e6a5fafb290b1798c88b2b"} Mar 21 03:50:42 crc kubenswrapper[4685]: I0321 03:50:42.545163 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7swf2" event={"ID":"d4ebdf18-8426-42cc-93a6-60b46261aebe","Type":"ContainerStarted","Data":"e94326960a75656f36838b094d1f74215691d8fcf8da493d4debe7bda468be4d"} Mar 21 03:50:42 crc kubenswrapper[4685]: I0321 03:50:42.547259 4685 generic.go:334] "Generic (PLEG): container finished" podID="f0e0a768-d2ec-4986-8cc7-72f0bd1d285a" containerID="a3c984485cfb2991b4602fbfe5daaa8e39fbcbabd4aee25932bc4653a5affe76" exitCode=0 Mar 21 03:50:42 crc kubenswrapper[4685]: I0321 03:50:42.547336 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zrnqq" event={"ID":"f0e0a768-d2ec-4986-8cc7-72f0bd1d285a","Type":"ContainerDied","Data":"a3c984485cfb2991b4602fbfe5daaa8e39fbcbabd4aee25932bc4653a5affe76"} Mar 21 03:50:42 crc kubenswrapper[4685]: I0321 03:50:42.551609 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lnbh8" event={"ID":"4d6c412c-bc35-4360-91b0-06f8b60e7106","Type":"ContainerStarted","Data":"bfe98363c9199450c193ea8c4bdfc54b17679ec5de3d6d5dfb44d2eb7f6aa678"} Mar 21 03:50:42 crc kubenswrapper[4685]: I0321 03:50:42.563209 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8qq2k" podStartSLOduration=32.904773727 podStartE2EDuration="45.563190462s" podCreationTimestamp="2026-03-21 03:49:57 +0000 UTC" firstStartedPulling="2026-03-21 03:50:29.385639283 +0000 UTC m=+261.862708085" lastFinishedPulling="2026-03-21 03:50:42.044056028 +0000 UTC m=+274.521124820" observedRunningTime="2026-03-21 03:50:42.558595343 +0000 UTC m=+275.035664135" watchObservedRunningTime="2026-03-21 03:50:42.563190462 +0000 UTC m=+275.040259254" Mar 21 03:50:42 crc kubenswrapper[4685]: I0321 03:50:42.576366 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lnbh8" podStartSLOduration=33.668909068 podStartE2EDuration="46.576349291s" podCreationTimestamp="2026-03-21 03:49:56 +0000 UTC" firstStartedPulling="2026-03-21 03:50:29.004378732 +0000 UTC m=+261.481447524" 
lastFinishedPulling="2026-03-21 03:50:41.911818955 +0000 UTC m=+274.388887747" observedRunningTime="2026-03-21 03:50:42.572820364 +0000 UTC m=+275.049889166" watchObservedRunningTime="2026-03-21 03:50:42.576349291 +0000 UTC m=+275.053418083" Mar 21 03:50:42 crc kubenswrapper[4685]: I0321 03:50:42.621896 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7swf2" podStartSLOduration=2.226027515 podStartE2EDuration="48.621873353s" podCreationTimestamp="2026-03-21 03:49:54 +0000 UTC" firstStartedPulling="2026-03-21 03:49:55.577909337 +0000 UTC m=+228.054978129" lastFinishedPulling="2026-03-21 03:50:41.973755175 +0000 UTC m=+274.450823967" observedRunningTime="2026-03-21 03:50:42.615492399 +0000 UTC m=+275.092561211" watchObservedRunningTime="2026-03-21 03:50:42.621873353 +0000 UTC m=+275.098942145" Mar 21 03:50:43 crc kubenswrapper[4685]: I0321 03:50:43.558419 4685 generic.go:334] "Generic (PLEG): container finished" podID="53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd" containerID="da61e0b016b7a40b491b616c4845382c3d01209a10c6acc87a24c0afcf8d5428" exitCode=0 Mar 21 03:50:43 crc kubenswrapper[4685]: I0321 03:50:43.558456 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xvjs7" event={"ID":"53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd","Type":"ContainerDied","Data":"da61e0b016b7a40b491b616c4845382c3d01209a10c6acc87a24c0afcf8d5428"} Mar 21 03:50:44 crc kubenswrapper[4685]: I0321 03:50:44.474159 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7swf2" Mar 21 03:50:44 crc kubenswrapper[4685]: I0321 03:50:44.474389 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7swf2" Mar 21 03:50:44 crc kubenswrapper[4685]: I0321 03:50:44.518585 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7swf2" Mar 21 03:50:44 crc kubenswrapper[4685]: I0321 03:50:44.567321 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zrnqq" event={"ID":"f0e0a768-d2ec-4986-8cc7-72f0bd1d285a","Type":"ContainerStarted","Data":"31deeee9e2e17aaa1f7fce67d10f5816041023ee773c3b8ac942aa018c0dcde1"} Mar 21 03:50:44 crc kubenswrapper[4685]: I0321 03:50:44.569026 4685 generic.go:334] "Generic (PLEG): container finished" podID="931ed0e7-7ffb-48ba-92b0-28883a6f0b39" containerID="0c95c902d01048b665b6b0be40dc6700543f1adc2471a719ebd801b93210b6dc" exitCode=0 Mar 21 03:50:44 crc kubenswrapper[4685]: I0321 03:50:44.569082 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mhc9s" event={"ID":"931ed0e7-7ffb-48ba-92b0-28883a6f0b39","Type":"ContainerDied","Data":"0c95c902d01048b665b6b0be40dc6700543f1adc2471a719ebd801b93210b6dc"} Mar 21 03:50:44 crc kubenswrapper[4685]: I0321 03:50:44.573335 4685 generic.go:334] "Generic (PLEG): container finished" podID="9c1f4e4f-a993-423e-8922-d8b81967d483" containerID="d1512e8f3dde8ccc1e35f43bb7da1f2aa36a9b112b74febd6ec320b3dc4cf06d" exitCode=0 Mar 21 03:50:44 crc kubenswrapper[4685]: I0321 03:50:44.573413 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s6khh" event={"ID":"9c1f4e4f-a993-423e-8922-d8b81967d483","Type":"ContainerDied","Data":"d1512e8f3dde8ccc1e35f43bb7da1f2aa36a9b112b74febd6ec320b3dc4cf06d"} Mar 21 03:50:45 crc kubenswrapper[4685]: I0321 
03:50:45.590520 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mhc9s" event={"ID":"931ed0e7-7ffb-48ba-92b0-28883a6f0b39","Type":"ContainerStarted","Data":"2ce4d1c38370e6fe7108157ec26a4478d484b05528860fe73b8f7f736aadd4c6"} Mar 21 03:50:45 crc kubenswrapper[4685]: I0321 03:50:45.593695 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s6khh" event={"ID":"9c1f4e4f-a993-423e-8922-d8b81967d483","Type":"ContainerStarted","Data":"b550faf13202a2d7d77ba9b28afd1da552906f0f94ea8be2f95b70e5f93e8266"} Mar 21 03:50:45 crc kubenswrapper[4685]: I0321 03:50:45.603434 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xvjs7" event={"ID":"53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd","Type":"ContainerStarted","Data":"4058faaca0183c0d4f767cb3a53404b38da72a2f4590534f2000518b56c87aa4"} Mar 21 03:50:45 crc kubenswrapper[4685]: I0321 03:50:45.608389 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mhc9s" podStartSLOduration=3.025115239 podStartE2EDuration="52.608368809s" podCreationTimestamp="2026-03-21 03:49:53 +0000 UTC" firstStartedPulling="2026-03-21 03:49:55.569565774 +0000 UTC m=+228.046634566" lastFinishedPulling="2026-03-21 03:50:45.152819344 +0000 UTC m=+277.629888136" observedRunningTime="2026-03-21 03:50:45.607793212 +0000 UTC m=+278.084862014" watchObservedRunningTime="2026-03-21 03:50:45.608368809 +0000 UTC m=+278.085437601" Mar 21 03:50:45 crc kubenswrapper[4685]: I0321 03:50:45.631811 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-s6khh" podStartSLOduration=7.275654072 podStartE2EDuration="50.63179322s" podCreationTimestamp="2026-03-21 03:49:55 +0000 UTC" firstStartedPulling="2026-03-21 03:50:01.818755426 +0000 UTC m=+234.295824218" lastFinishedPulling="2026-03-21 03:50:45.174894574 +0000 UTC m=+277.651963366" observedRunningTime="2026-03-21 03:50:45.630612504 +0000 UTC m=+278.107681296" watchObservedRunningTime="2026-03-21 03:50:45.63179322 +0000 UTC m=+278.108862012" Mar 21 03:50:45 crc kubenswrapper[4685]: I0321 03:50:45.666614 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xvjs7" podStartSLOduration=3.479495269 podStartE2EDuration="52.666596727s" podCreationTimestamp="2026-03-21 03:49:53 +0000 UTC" firstStartedPulling="2026-03-21 03:49:55.593368816 +0000 UTC m=+228.070437608" lastFinishedPulling="2026-03-21 03:50:44.780470274 +0000 UTC m=+277.257539066" observedRunningTime="2026-03-21 03:50:45.649977812 +0000 UTC m=+278.127046624" watchObservedRunningTime="2026-03-21 03:50:45.666596727 +0000 UTC m=+278.143665529" Mar 21 03:50:45 crc kubenswrapper[4685]: I0321 03:50:45.669442 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zrnqq" podStartSLOduration=7.596876379 podStartE2EDuration="49.669433063s" podCreationTimestamp="2026-03-21 03:49:56 +0000 UTC" firstStartedPulling="2026-03-21 03:50:01.818808037 +0000 UTC m=+234.295876829" lastFinishedPulling="2026-03-21 03:50:43.891364721 +0000 UTC m=+276.368433513" observedRunningTime="2026-03-21 03:50:45.665846494 +0000 UTC m=+278.142915286" watchObservedRunningTime="2026-03-21 03:50:45.669433063 +0000 UTC m=+278.146501855" Mar 21 03:50:46 crc kubenswrapper[4685]: I0321 03:50:46.300209 4685 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-s6khh" Mar 21 03:50:46 crc kubenswrapper[4685]: I0321 03:50:46.308264 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-s6khh" Mar 21 03:50:46 crc kubenswrapper[4685]: I0321 03:50:46.673160 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zrnqq" Mar 21 03:50:46 crc kubenswrapper[4685]: I0321 03:50:46.673209 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zrnqq" Mar 21 03:50:46 crc kubenswrapper[4685]: I0321 03:50:46.716654 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zrnqq" Mar 21 03:50:47 crc kubenswrapper[4685]: I0321 03:50:47.344556 4685 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-s6khh" podUID="9c1f4e4f-a993-423e-8922-d8b81967d483" containerName="registry-server" probeResult="failure" output=< Mar 21 03:50:47 crc kubenswrapper[4685]: timeout: failed to connect service ":50051" within 1s Mar 21 03:50:47 crc kubenswrapper[4685]: > Mar 21 03:50:47 crc kubenswrapper[4685]: I0321 03:50:47.353766 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lnbh8" Mar 21 03:50:47 crc kubenswrapper[4685]: I0321 03:50:47.354055 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lnbh8" Mar 21 03:50:47 crc kubenswrapper[4685]: I0321 03:50:47.647041 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8qq2k" Mar 21 03:50:47 crc kubenswrapper[4685]: I0321 03:50:47.647104 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8qq2k" Mar 21 03:50:48 crc kubenswrapper[4685]: I0321 03:50:48.391606 4685 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lnbh8" podUID="4d6c412c-bc35-4360-91b0-06f8b60e7106" containerName="registry-server" probeResult="failure" output=< Mar 21 03:50:48 crc kubenswrapper[4685]: timeout: failed to connect service ":50051" within 1s Mar 21 03:50:48 crc kubenswrapper[4685]: > Mar 21 03:50:48 crc kubenswrapper[4685]: I0321 03:50:48.701250 4685 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8qq2k" podUID="855b8c82-585a-4883-acdc-195377b480c2" containerName="registry-server" probeResult="failure" output=< Mar 21 03:50:48 crc kubenswrapper[4685]: timeout: failed to connect service ":50051" within 1s Mar 21 03:50:48 crc kubenswrapper[4685]: > Mar 21 03:50:52 crc kubenswrapper[4685]: I0321 03:50:52.568276 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-866b8556c4-tvtvg"] Mar 21 03:50:52 crc kubenswrapper[4685]: I0321 03:50:52.569122 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-866b8556c4-tvtvg" podUID="d5321225-ce62-44de-bb49-821ec1823946" containerName="controller-manager" containerID="cri-o://5ca9a11f9078ce21f842573c019b6109d22d8ea2b1d0f79dbf0587ae599bb87c" gracePeriod=30 Mar 21 03:50:52 crc kubenswrapper[4685]: I0321 03:50:52.589567 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-856b8c6d67-q7b7q"] Mar 21 03:50:52 crc kubenswrapper[4685]: I0321 03:50:52.589785 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-856b8c6d67-q7b7q" podUID="991e3873-a3a7-4195-9323-606a14313a5a" containerName="route-controller-manager" containerID="cri-o://4795f906f8314a5957ff1c3e1c0fa30773d974eca6cf3eaac417cb7b2d13d120" gracePeriod=30 Mar 21 03:50:53 crc kubenswrapper[4685]: I0321 03:50:53.666296 4685 generic.go:334] "Generic (PLEG): container finished" podID="991e3873-a3a7-4195-9323-606a14313a5a" containerID="4795f906f8314a5957ff1c3e1c0fa30773d974eca6cf3eaac417cb7b2d13d120" exitCode=0 Mar 21 03:50:53 crc kubenswrapper[4685]: I0321 03:50:53.666398 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-856b8c6d67-q7b7q" event={"ID":"991e3873-a3a7-4195-9323-606a14313a5a","Type":"ContainerDied","Data":"4795f906f8314a5957ff1c3e1c0fa30773d974eca6cf3eaac417cb7b2d13d120"} Mar 21 03:50:53 crc kubenswrapper[4685]: I0321 03:50:53.669198 4685 generic.go:334] "Generic (PLEG): container finished" podID="d5321225-ce62-44de-bb49-821ec1823946" containerID="5ca9a11f9078ce21f842573c019b6109d22d8ea2b1d0f79dbf0587ae599bb87c" exitCode=0 Mar 21 03:50:53 crc kubenswrapper[4685]: I0321 03:50:53.669250 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-866b8556c4-tvtvg" event={"ID":"d5321225-ce62-44de-bb49-821ec1823946","Type":"ContainerDied","Data":"5ca9a11f9078ce21f842573c019b6109d22d8ea2b1d0f79dbf0587ae599bb87c"} Mar 21 03:50:53 crc kubenswrapper[4685]: I0321 03:50:53.921320 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-856b8c6d67-q7b7q" Mar 21 03:50:53 crc kubenswrapper[4685]: I0321 03:50:53.959251 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv"] Mar 21 03:50:53 crc kubenswrapper[4685]: E0321 03:50:53.959441 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7b276b3-8b85-4cfe-a39c-73da270336e3" containerName="extract-utilities" Mar 21 03:50:53 crc kubenswrapper[4685]: I0321 03:50:53.959452 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7b276b3-8b85-4cfe-a39c-73da270336e3" containerName="extract-utilities" Mar 21 03:50:53 crc kubenswrapper[4685]: E0321 03:50:53.959483 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7b276b3-8b85-4cfe-a39c-73da270336e3" containerName="registry-server" Mar 21 03:50:53 crc kubenswrapper[4685]: I0321 03:50:53.959491 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7b276b3-8b85-4cfe-a39c-73da270336e3" containerName="registry-server" Mar 21 03:50:53 crc kubenswrapper[4685]: E0321 03:50:53.959500 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7b276b3-8b85-4cfe-a39c-73da270336e3" containerName="extract-content" Mar 21 03:50:53 crc kubenswrapper[4685]: I0321 03:50:53.959505 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7b276b3-8b85-4cfe-a39c-73da270336e3" containerName="extract-content" Mar 21 03:50:53 crc kubenswrapper[4685]: E0321 03:50:53.959513 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="991e3873-a3a7-4195-9323-606a14313a5a" containerName="route-controller-manager" Mar 21 03:50:53 crc kubenswrapper[4685]: I0321 03:50:53.959518 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="991e3873-a3a7-4195-9323-606a14313a5a" containerName="route-controller-manager" Mar 21 03:50:53 crc kubenswrapper[4685]: I0321 03:50:53.959618 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7b276b3-8b85-4cfe-a39c-73da270336e3" containerName="registry-server" Mar 21 03:50:53 crc kubenswrapper[4685]: I0321 03:50:53.959626 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="991e3873-a3a7-4195-9323-606a14313a5a" containerName="route-controller-manager" Mar 21 03:50:53 crc kubenswrapper[4685]: I0321 03:50:53.960020 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" Mar 21 03:50:53 crc kubenswrapper[4685]: I0321 03:50:53.967269 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv"] Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.041182 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mhc9s" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.042944 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mhc9s" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.047331 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/991e3873-a3a7-4195-9323-606a14313a5a-serving-cert\") pod \"991e3873-a3a7-4195-9323-606a14313a5a\" (UID: \"991e3873-a3a7-4195-9323-606a14313a5a\") " Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.047384 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/991e3873-a3a7-4195-9323-606a14313a5a-client-ca\") pod \"991e3873-a3a7-4195-9323-606a14313a5a\" (UID: \"991e3873-a3a7-4195-9323-606a14313a5a\") " Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.047517 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kg5nc\" (UniqueName: \"kubernetes.io/projected/991e3873-a3a7-4195-9323-606a14313a5a-kube-api-access-kg5nc\") pod \"991e3873-a3a7-4195-9323-606a14313a5a\" (UID: \"991e3873-a3a7-4195-9323-606a14313a5a\") " Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.047564 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/991e3873-a3a7-4195-9323-606a14313a5a-config\") pod \"991e3873-a3a7-4195-9323-606a14313a5a\" (UID: \"991e3873-a3a7-4195-9323-606a14313a5a\") " Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.048587 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/991e3873-a3a7-4195-9323-606a14313a5a-client-ca" (OuterVolumeSpecName: "client-ca") pod "991e3873-a3a7-4195-9323-606a14313a5a" (UID: "991e3873-a3a7-4195-9323-606a14313a5a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.048696 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/991e3873-a3a7-4195-9323-606a14313a5a-config" (OuterVolumeSpecName: "config") pod "991e3873-a3a7-4195-9323-606a14313a5a" (UID: "991e3873-a3a7-4195-9323-606a14313a5a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.054351 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/991e3873-a3a7-4195-9323-606a14313a5a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "991e3873-a3a7-4195-9323-606a14313a5a" (UID: "991e3873-a3a7-4195-9323-606a14313a5a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.055705 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/991e3873-a3a7-4195-9323-606a14313a5a-kube-api-access-kg5nc" (OuterVolumeSpecName: "kube-api-access-kg5nc") pod "991e3873-a3a7-4195-9323-606a14313a5a" (UID: "991e3873-a3a7-4195-9323-606a14313a5a"). InnerVolumeSpecName "kube-api-access-kg5nc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.088024 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mhc9s" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.137561 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-866b8556c4-tvtvg" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.148528 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fwq2\" (UniqueName: \"kubernetes.io/projected/de82f429-1aa8-465f-b944-2bfb17d7d26b-kube-api-access-9fwq2\") pod \"route-controller-manager-5bc69c9d55-7trxv\" (UID: \"de82f429-1aa8-465f-b944-2bfb17d7d26b\") " pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.148571 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de82f429-1aa8-465f-b944-2bfb17d7d26b-config\") pod \"route-controller-manager-5bc69c9d55-7trxv\" (UID: \"de82f429-1aa8-465f-b944-2bfb17d7d26b\") " pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.148600 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de82f429-1aa8-465f-b944-2bfb17d7d26b-client-ca\") pod \"route-controller-manager-5bc69c9d55-7trxv\" (UID: \"de82f429-1aa8-465f-b944-2bfb17d7d26b\") " pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.148666 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de82f429-1aa8-465f-b944-2bfb17d7d26b-serving-cert\") pod \"route-controller-manager-5bc69c9d55-7trxv\" (UID: \"de82f429-1aa8-465f-b944-2bfb17d7d26b\") " pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.148730 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kg5nc\" (UniqueName: \"kubernetes.io/projected/991e3873-a3a7-4195-9323-606a14313a5a-kube-api-access-kg5nc\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.148744 4685 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/991e3873-a3a7-4195-9323-606a14313a5a-config\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.148752 4685 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/991e3873-a3a7-4195-9323-606a14313a5a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:54 crc 
kubenswrapper[4685]: I0321 03:50:54.148762 4685 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/991e3873-a3a7-4195-9323-606a14313a5a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.238305 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xvjs7" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.238359 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xvjs7" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.250103 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d5321225-ce62-44de-bb49-821ec1823946-proxy-ca-bundles\") pod \"d5321225-ce62-44de-bb49-821ec1823946\" (UID: \"d5321225-ce62-44de-bb49-821ec1823946\") " Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.250159 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5321225-ce62-44de-bb49-821ec1823946-serving-cert\") pod \"d5321225-ce62-44de-bb49-821ec1823946\" (UID: \"d5321225-ce62-44de-bb49-821ec1823946\") " Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.250190 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgvfx\" (UniqueName: \"kubernetes.io/projected/d5321225-ce62-44de-bb49-821ec1823946-kube-api-access-fgvfx\") pod \"d5321225-ce62-44de-bb49-821ec1823946\" (UID: \"d5321225-ce62-44de-bb49-821ec1823946\") " Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.250306 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d5321225-ce62-44de-bb49-821ec1823946-client-ca\") pod \"d5321225-ce62-44de-bb49-821ec1823946\" (UID: \"d5321225-ce62-44de-bb49-821ec1823946\") " Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.250407 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5321225-ce62-44de-bb49-821ec1823946-config\") pod \"d5321225-ce62-44de-bb49-821ec1823946\" (UID: \"d5321225-ce62-44de-bb49-821ec1823946\") " Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.250609 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de82f429-1aa8-465f-b944-2bfb17d7d26b-serving-cert\") pod \"route-controller-manager-5bc69c9d55-7trxv\" (UID: \"de82f429-1aa8-465f-b944-2bfb17d7d26b\") " pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.250705 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fwq2\" (UniqueName: \"kubernetes.io/projected/de82f429-1aa8-465f-b944-2bfb17d7d26b-kube-api-access-9fwq2\") pod \"route-controller-manager-5bc69c9d55-7trxv\" (UID: \"de82f429-1aa8-465f-b944-2bfb17d7d26b\") " pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.250733 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de82f429-1aa8-465f-b944-2bfb17d7d26b-config\") pod \"route-controller-manager-5bc69c9d55-7trxv\" 
(UID: \"de82f429-1aa8-465f-b944-2bfb17d7d26b\") " pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.250769 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de82f429-1aa8-465f-b944-2bfb17d7d26b-client-ca\") pod \"route-controller-manager-5bc69c9d55-7trxv\" (UID: \"de82f429-1aa8-465f-b944-2bfb17d7d26b\") " pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.250955 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5321225-ce62-44de-bb49-821ec1823946-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d5321225-ce62-44de-bb49-821ec1823946" (UID: "d5321225-ce62-44de-bb49-821ec1823946"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.251148 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5321225-ce62-44de-bb49-821ec1823946-client-ca" (OuterVolumeSpecName: "client-ca") pod "d5321225-ce62-44de-bb49-821ec1823946" (UID: "d5321225-ce62-44de-bb49-821ec1823946"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.251542 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5321225-ce62-44de-bb49-821ec1823946-config" (OuterVolumeSpecName: "config") pod "d5321225-ce62-44de-bb49-821ec1823946" (UID: "d5321225-ce62-44de-bb49-821ec1823946"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.251994 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de82f429-1aa8-465f-b944-2bfb17d7d26b-client-ca\") pod \"route-controller-manager-5bc69c9d55-7trxv\" (UID: \"de82f429-1aa8-465f-b944-2bfb17d7d26b\") " pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.252034 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de82f429-1aa8-465f-b944-2bfb17d7d26b-config\") pod \"route-controller-manager-5bc69c9d55-7trxv\" (UID: \"de82f429-1aa8-465f-b944-2bfb17d7d26b\") " pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.254168 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5321225-ce62-44de-bb49-821ec1823946-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d5321225-ce62-44de-bb49-821ec1823946" (UID: "d5321225-ce62-44de-bb49-821ec1823946"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.254726 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5321225-ce62-44de-bb49-821ec1823946-kube-api-access-fgvfx" (OuterVolumeSpecName: "kube-api-access-fgvfx") pod "d5321225-ce62-44de-bb49-821ec1823946" (UID: "d5321225-ce62-44de-bb49-821ec1823946"). InnerVolumeSpecName "kube-api-access-fgvfx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.255011 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de82f429-1aa8-465f-b944-2bfb17d7d26b-serving-cert\") pod \"route-controller-manager-5bc69c9d55-7trxv\" (UID: \"de82f429-1aa8-465f-b944-2bfb17d7d26b\") " pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.274601 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fwq2\" (UniqueName: \"kubernetes.io/projected/de82f429-1aa8-465f-b944-2bfb17d7d26b-kube-api-access-9fwq2\") pod \"route-controller-manager-5bc69c9d55-7trxv\" (UID: \"de82f429-1aa8-465f-b944-2bfb17d7d26b\") " pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.278399 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xvjs7" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.286048 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.352185 4685 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5321225-ce62-44de-bb49-821ec1823946-config\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.352216 4685 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d5321225-ce62-44de-bb49-821ec1823946-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.352227 4685 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5321225-ce62-44de-bb49-821ec1823946-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.352238 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgvfx\" (UniqueName: \"kubernetes.io/projected/d5321225-ce62-44de-bb49-821ec1823946-kube-api-access-fgvfx\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.352247 4685 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d5321225-ce62-44de-bb49-821ec1823946-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.524477 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7swf2" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.675953 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-856b8c6d67-q7b7q" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.675995 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-856b8c6d67-q7b7q" event={"ID":"991e3873-a3a7-4195-9323-606a14313a5a","Type":"ContainerDied","Data":"d8b74c61366b8e8880939f713344df6c68d93acc279a6eb4863c1678477491c8"} Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.676062 4685 scope.go:117] "RemoveContainer" containerID="4795f906f8314a5957ff1c3e1c0fa30773d974eca6cf3eaac417cb7b2d13d120" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.678911 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-866b8556c4-tvtvg" event={"ID":"d5321225-ce62-44de-bb49-821ec1823946","Type":"ContainerDied","Data":"524c2e261280c095be9ac5d50861c9e472d482d058241042672e9fdf696e7359"} Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.679033 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-866b8556c4-tvtvg" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.705900 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-856b8c6d67-q7b7q"] Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.711380 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-856b8c6d67-q7b7q"] Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.713217 4685 scope.go:117] "RemoveContainer" containerID="5ca9a11f9078ce21f842573c019b6109d22d8ea2b1d0f79dbf0587ae599bb87c" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.722683 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mhc9s" Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.722934 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv"] Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.727746 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-866b8556c4-tvtvg"] Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.731552 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-866b8556c4-tvtvg"] Mar 21 03:50:54 crc kubenswrapper[4685]: W0321 03:50:54.734398 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde82f429_1aa8_465f_b944_2bfb17d7d26b.slice/crio-08608c8d3f66aa37fcbc6a66679212d0d22022972aa9c737ca6d721ee36f9883 WatchSource:0}: Error finding container 08608c8d3f66aa37fcbc6a66679212d0d22022972aa9c737ca6d721ee36f9883: Status 404 returned error can't find the container with id 08608c8d3f66aa37fcbc6a66679212d0d22022972aa9c737ca6d721ee36f9883 Mar 21 03:50:54 crc kubenswrapper[4685]: I0321 03:50:54.757203 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xvjs7" Mar 21 03:50:55 crc kubenswrapper[4685]: I0321 03:50:55.687730 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" 
event={"ID":"de82f429-1aa8-465f-b944-2bfb17d7d26b","Type":"ContainerStarted","Data":"34cb40a6d28fd95aa004fa6ca3edfc26871731c6adc4c21264ccaa096616a292"} Mar 21 03:50:55 crc kubenswrapper[4685]: I0321 03:50:55.687778 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" event={"ID":"de82f429-1aa8-465f-b944-2bfb17d7d26b","Type":"ContainerStarted","Data":"08608c8d3f66aa37fcbc6a66679212d0d22022972aa9c737ca6d721ee36f9883"} Mar 21 03:50:55 crc kubenswrapper[4685]: I0321 03:50:55.688541 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" Mar 21 03:50:55 crc kubenswrapper[4685]: I0321 03:50:55.717388 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" podStartSLOduration=3.717364452 podStartE2EDuration="3.717364452s" podCreationTimestamp="2026-03-21 03:50:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:50:55.703008546 +0000 UTC m=+288.180077368" watchObservedRunningTime="2026-03-21 03:50:55.717364452 +0000 UTC m=+288.194433244" Mar 21 03:50:55 crc kubenswrapper[4685]: I0321 03:50:55.857687 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" Mar 21 03:50:56 crc kubenswrapper[4685]: I0321 03:50:56.310204 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="991e3873-a3a7-4195-9323-606a14313a5a" path="/var/lib/kubelet/pods/991e3873-a3a7-4195-9323-606a14313a5a/volumes" Mar 21 03:50:56 crc kubenswrapper[4685]: I0321 03:50:56.311123 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5321225-ce62-44de-bb49-821ec1823946" path="/var/lib/kubelet/pods/d5321225-ce62-44de-bb49-821ec1823946/volumes" Mar 21 03:50:56 crc kubenswrapper[4685]: I0321 03:50:56.358648 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-s6khh" Mar 21 03:50:56 crc kubenswrapper[4685]: I0321 03:50:56.411187 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-s6khh" Mar 21 03:50:56 crc kubenswrapper[4685]: I0321 03:50:56.518504 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-75994f449d-jj86b"] Mar 21 03:50:56 crc kubenswrapper[4685]: E0321 03:50:56.518737 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5321225-ce62-44de-bb49-821ec1823946" containerName="controller-manager" Mar 21 03:50:56 crc kubenswrapper[4685]: I0321 03:50:56.518750 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5321225-ce62-44de-bb49-821ec1823946" containerName="controller-manager" Mar 21 03:50:56 crc kubenswrapper[4685]: I0321 03:50:56.518913 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5321225-ce62-44de-bb49-821ec1823946" containerName="controller-manager" Mar 21 03:50:56 crc kubenswrapper[4685]: I0321 03:50:56.519335 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" Mar 21 03:50:56 crc kubenswrapper[4685]: I0321 03:50:56.521152 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 21 03:50:56 crc kubenswrapper[4685]: I0321 03:50:56.522731 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 21 03:50:56 crc kubenswrapper[4685]: I0321 03:50:56.522777 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 21 03:50:56 crc kubenswrapper[4685]: I0321 03:50:56.528254 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 21 03:50:56 crc kubenswrapper[4685]: I0321 03:50:56.528264 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 21 03:50:56 crc kubenswrapper[4685]: I0321 03:50:56.528334 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 21 03:50:56 crc kubenswrapper[4685]: I0321 03:50:56.528277 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 21 03:50:56 crc kubenswrapper[4685]: I0321 03:50:56.542901 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75994f449d-jj86b"] Mar 21 03:50:56 crc kubenswrapper[4685]: I0321 03:50:56.680918 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ede8a25b-fb4b-4cbc-914d-1fc24155b8f1-serving-cert\") pod \"controller-manager-75994f449d-jj86b\" (UID: \"ede8a25b-fb4b-4cbc-914d-1fc24155b8f1\") " pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" Mar 21 03:50:56 crc kubenswrapper[4685]: I0321 03:50:56.680982 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ede8a25b-fb4b-4cbc-914d-1fc24155b8f1-proxy-ca-bundles\") pod \"controller-manager-75994f449d-jj86b\" (UID: \"ede8a25b-fb4b-4cbc-914d-1fc24155b8f1\") " pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" Mar 21 03:50:56 crc kubenswrapper[4685]: I0321 03:50:56.681012 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ede8a25b-fb4b-4cbc-914d-1fc24155b8f1-client-ca\") pod \"controller-manager-75994f449d-jj86b\" (UID: \"ede8a25b-fb4b-4cbc-914d-1fc24155b8f1\") " pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" Mar 21 03:50:56 crc kubenswrapper[4685]: I0321 03:50:56.681046 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ede8a25b-fb4b-4cbc-914d-1fc24155b8f1-config\") pod \"controller-manager-75994f449d-jj86b\" (UID: \"ede8a25b-fb4b-4cbc-914d-1fc24155b8f1\") " pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" Mar 21 03:50:56 crc kubenswrapper[4685]: I0321 03:50:56.681074 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r959n\" (UniqueName: 
\"kubernetes.io/projected/ede8a25b-fb4b-4cbc-914d-1fc24155b8f1-kube-api-access-r959n\") pod \"controller-manager-75994f449d-jj86b\" (UID: \"ede8a25b-fb4b-4cbc-914d-1fc24155b8f1\") " pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" Mar 21 03:50:56 crc kubenswrapper[4685]: I0321 03:50:56.718455 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zrnqq" Mar 21 03:50:56 crc kubenswrapper[4685]: I0321 03:50:56.781722 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ede8a25b-fb4b-4cbc-914d-1fc24155b8f1-serving-cert\") pod \"controller-manager-75994f449d-jj86b\" (UID: \"ede8a25b-fb4b-4cbc-914d-1fc24155b8f1\") " pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" Mar 21 03:50:56 crc kubenswrapper[4685]: I0321 03:50:56.781785 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ede8a25b-fb4b-4cbc-914d-1fc24155b8f1-proxy-ca-bundles\") pod \"controller-manager-75994f449d-jj86b\" (UID: \"ede8a25b-fb4b-4cbc-914d-1fc24155b8f1\") " pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" Mar 21 03:50:56 crc kubenswrapper[4685]: I0321 03:50:56.781823 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ede8a25b-fb4b-4cbc-914d-1fc24155b8f1-client-ca\") pod \"controller-manager-75994f449d-jj86b\" (UID: \"ede8a25b-fb4b-4cbc-914d-1fc24155b8f1\") " pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" Mar 21 03:50:56 crc kubenswrapper[4685]: I0321 03:50:56.781882 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ede8a25b-fb4b-4cbc-914d-1fc24155b8f1-config\") pod \"controller-manager-75994f449d-jj86b\" (UID: \"ede8a25b-fb4b-4cbc-914d-1fc24155b8f1\") " pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" Mar 21 03:50:56 crc kubenswrapper[4685]: I0321 03:50:56.781906 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r959n\" (UniqueName: \"kubernetes.io/projected/ede8a25b-fb4b-4cbc-914d-1fc24155b8f1-kube-api-access-r959n\") pod \"controller-manager-75994f449d-jj86b\" (UID: \"ede8a25b-fb4b-4cbc-914d-1fc24155b8f1\") " pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" Mar 21 03:50:56 crc kubenswrapper[4685]: I0321 03:50:56.782950 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ede8a25b-fb4b-4cbc-914d-1fc24155b8f1-client-ca\") pod \"controller-manager-75994f449d-jj86b\" (UID: \"ede8a25b-fb4b-4cbc-914d-1fc24155b8f1\") " pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" Mar 21 03:50:56 crc kubenswrapper[4685]: I0321 03:50:56.783627 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ede8a25b-fb4b-4cbc-914d-1fc24155b8f1-config\") pod \"controller-manager-75994f449d-jj86b\" (UID: \"ede8a25b-fb4b-4cbc-914d-1fc24155b8f1\") " pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" Mar 21 03:50:56 crc kubenswrapper[4685]: I0321 03:50:56.784522 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/ede8a25b-fb4b-4cbc-914d-1fc24155b8f1-proxy-ca-bundles\") pod \"controller-manager-75994f449d-jj86b\" (UID: \"ede8a25b-fb4b-4cbc-914d-1fc24155b8f1\") " pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" Mar 21 03:50:56 crc kubenswrapper[4685]: I0321 03:50:56.787735 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ede8a25b-fb4b-4cbc-914d-1fc24155b8f1-serving-cert\") pod \"controller-manager-75994f449d-jj86b\" (UID: \"ede8a25b-fb4b-4cbc-914d-1fc24155b8f1\") " pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" Mar 21 03:50:56 crc kubenswrapper[4685]: I0321 03:50:56.799304 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r959n\" (UniqueName: \"kubernetes.io/projected/ede8a25b-fb4b-4cbc-914d-1fc24155b8f1-kube-api-access-r959n\") pod \"controller-manager-75994f449d-jj86b\" (UID: \"ede8a25b-fb4b-4cbc-914d-1fc24155b8f1\") " pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" Mar 21 03:50:56 crc kubenswrapper[4685]: I0321 03:50:56.835024 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" Mar 21 03:50:56 crc kubenswrapper[4685]: I0321 03:50:56.927143 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7swf2"] Mar 21 03:50:56 crc kubenswrapper[4685]: I0321 03:50:56.927566 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7swf2" podUID="d4ebdf18-8426-42cc-93a6-60b46261aebe" containerName="registry-server" containerID="cri-o://e94326960a75656f36838b094d1f74215691d8fcf8da493d4debe7bda468be4d" gracePeriod=2 Mar 21 03:50:57 crc kubenswrapper[4685]: I0321 03:50:57.242056 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75994f449d-jj86b"] Mar 21 03:50:57 crc kubenswrapper[4685]: W0321 03:50:57.248615 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podede8a25b_fb4b_4cbc_914d_1fc24155b8f1.slice/crio-d2fe2e58c472834e3ddfc65aaef49ca0fdc3eb17e4fc2e024cbf1249f7897d52 WatchSource:0}: Error finding container d2fe2e58c472834e3ddfc65aaef49ca0fdc3eb17e4fc2e024cbf1249f7897d52: Status 404 returned error can't find the container with id d2fe2e58c472834e3ddfc65aaef49ca0fdc3eb17e4fc2e024cbf1249f7897d52 Mar 21 03:50:57 crc kubenswrapper[4685]: I0321 03:50:57.274958 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7swf2" Mar 21 03:50:57 crc kubenswrapper[4685]: I0321 03:50:57.393662 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lnbh8" Mar 21 03:50:57 crc kubenswrapper[4685]: I0321 03:50:57.395262 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4ebdf18-8426-42cc-93a6-60b46261aebe-catalog-content\") pod \"d4ebdf18-8426-42cc-93a6-60b46261aebe\" (UID: \"d4ebdf18-8426-42cc-93a6-60b46261aebe\") " Mar 21 03:50:57 crc kubenswrapper[4685]: I0321 03:50:57.395369 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4ebdf18-8426-42cc-93a6-60b46261aebe-utilities\") pod \"d4ebdf18-8426-42cc-93a6-60b46261aebe\" (UID: \"d4ebdf18-8426-42cc-93a6-60b46261aebe\") " Mar 21 03:50:57 crc kubenswrapper[4685]: I0321 03:50:57.395396 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zktx\" (UniqueName: \"kubernetes.io/projected/d4ebdf18-8426-42cc-93a6-60b46261aebe-kube-api-access-5zktx\") pod \"d4ebdf18-8426-42cc-93a6-60b46261aebe\" (UID: \"d4ebdf18-8426-42cc-93a6-60b46261aebe\") " Mar 21 03:50:57 crc kubenswrapper[4685]: I0321 03:50:57.396484 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4ebdf18-8426-42cc-93a6-60b46261aebe-utilities" (OuterVolumeSpecName: "utilities") pod "d4ebdf18-8426-42cc-93a6-60b46261aebe" (UID: "d4ebdf18-8426-42cc-93a6-60b46261aebe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 03:50:57 crc kubenswrapper[4685]: I0321 03:50:57.401069 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4ebdf18-8426-42cc-93a6-60b46261aebe-kube-api-access-5zktx" (OuterVolumeSpecName: "kube-api-access-5zktx") pod "d4ebdf18-8426-42cc-93a6-60b46261aebe" (UID: "d4ebdf18-8426-42cc-93a6-60b46261aebe"). InnerVolumeSpecName "kube-api-access-5zktx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:50:57 crc kubenswrapper[4685]: I0321 03:50:57.439539 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lnbh8" Mar 21 03:50:57 crc kubenswrapper[4685]: I0321 03:50:57.450135 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4ebdf18-8426-42cc-93a6-60b46261aebe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d4ebdf18-8426-42cc-93a6-60b46261aebe" (UID: "d4ebdf18-8426-42cc-93a6-60b46261aebe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 03:50:57 crc kubenswrapper[4685]: I0321 03:50:57.496631 4685 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4ebdf18-8426-42cc-93a6-60b46261aebe-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:57 crc kubenswrapper[4685]: I0321 03:50:57.496663 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zktx\" (UniqueName: \"kubernetes.io/projected/d4ebdf18-8426-42cc-93a6-60b46261aebe-kube-api-access-5zktx\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:57 crc kubenswrapper[4685]: I0321 03:50:57.496672 4685 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4ebdf18-8426-42cc-93a6-60b46261aebe-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 03:50:57 crc kubenswrapper[4685]: I0321 03:50:57.680988 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8qq2k" Mar 21 03:50:57 crc kubenswrapper[4685]: I0321 03:50:57.717959 4685 generic.go:334] "Generic (PLEG): container finished" podID="d4ebdf18-8426-42cc-93a6-60b46261aebe" containerID="e94326960a75656f36838b094d1f74215691d8fcf8da493d4debe7bda468be4d" exitCode=0 Mar 21 03:50:57 crc kubenswrapper[4685]: I0321 03:50:57.718027 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7swf2" event={"ID":"d4ebdf18-8426-42cc-93a6-60b46261aebe","Type":"ContainerDied","Data":"e94326960a75656f36838b094d1f74215691d8fcf8da493d4debe7bda468be4d"} Mar 21 03:50:57 crc kubenswrapper[4685]: I0321 03:50:57.718100 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7swf2" event={"ID":"d4ebdf18-8426-42cc-93a6-60b46261aebe","Type":"ContainerDied","Data":"2801d5cc66b2ed27f43d05ec14001e37a33f3305271ce03aba83a57ba3e6fcac"} Mar 21 03:50:57 crc kubenswrapper[4685]: I0321 03:50:57.718075 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7swf2" Mar 21 03:50:57 crc kubenswrapper[4685]: I0321 03:50:57.718120 4685 scope.go:117] "RemoveContainer" containerID="e94326960a75656f36838b094d1f74215691d8fcf8da493d4debe7bda468be4d" Mar 21 03:50:57 crc kubenswrapper[4685]: I0321 03:50:57.721191 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" event={"ID":"ede8a25b-fb4b-4cbc-914d-1fc24155b8f1","Type":"ContainerStarted","Data":"b4bea199b4019ea408a4c73277feb201cbfe2473afe511bae3b416a4fb570d0f"} Mar 21 03:50:57 crc kubenswrapper[4685]: I0321 03:50:57.721262 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" event={"ID":"ede8a25b-fb4b-4cbc-914d-1fc24155b8f1","Type":"ContainerStarted","Data":"d2fe2e58c472834e3ddfc65aaef49ca0fdc3eb17e4fc2e024cbf1249f7897d52"} Mar 21 03:50:57 crc kubenswrapper[4685]: I0321 03:50:57.730458 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8qq2k" Mar 21 03:50:57 crc kubenswrapper[4685]: I0321 03:50:57.736867 4685 scope.go:117] "RemoveContainer" containerID="d478a553216614720b07cd59fe60242ee94a243f67d0ea92b29ac48a5c59b7e0" Mar 21 03:50:57 crc kubenswrapper[4685]: I0321 03:50:57.755938 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" podStartSLOduration=5.755922289 podStartE2EDuration="5.755922289s" podCreationTimestamp="2026-03-21 03:50:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:50:57.745132882 +0000 UTC m=+290.222201714" watchObservedRunningTime="2026-03-21 03:50:57.755922289 +0000 UTC m=+290.232991081" Mar 21 03:50:57 crc kubenswrapper[4685]: I0321 03:50:57.770302 4685 scope.go:117] "RemoveContainer" containerID="8baa23e2996f7d8a9daba457d2de4f70d70272f6a61170df0cdd73c0fef91103" Mar 21 03:50:57 crc kubenswrapper[4685]: I0321 03:50:57.770888 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7swf2"] Mar 21 03:50:57 crc kubenswrapper[4685]: I0321 03:50:57.782094 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7swf2"] Mar 21 03:50:57 crc kubenswrapper[4685]: I0321 03:50:57.789953 4685 scope.go:117] "RemoveContainer" containerID="e94326960a75656f36838b094d1f74215691d8fcf8da493d4debe7bda468be4d" Mar 21 03:50:57 crc kubenswrapper[4685]: E0321 03:50:57.790411 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e94326960a75656f36838b094d1f74215691d8fcf8da493d4debe7bda468be4d\": container with ID starting with e94326960a75656f36838b094d1f74215691d8fcf8da493d4debe7bda468be4d not found: ID does not exist" containerID="e94326960a75656f36838b094d1f74215691d8fcf8da493d4debe7bda468be4d" Mar 21 03:50:57 crc kubenswrapper[4685]: I0321 03:50:57.790464 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e94326960a75656f36838b094d1f74215691d8fcf8da493d4debe7bda468be4d"} err="failed to get container status \"e94326960a75656f36838b094d1f74215691d8fcf8da493d4debe7bda468be4d\": rpc error: code = NotFound desc = could not find container \"e94326960a75656f36838b094d1f74215691d8fcf8da493d4debe7bda468be4d\": container with ID 
starting with e94326960a75656f36838b094d1f74215691d8fcf8da493d4debe7bda468be4d not found: ID does not exist" Mar 21 03:50:57 crc kubenswrapper[4685]: I0321 03:50:57.790496 4685 scope.go:117] "RemoveContainer" containerID="d478a553216614720b07cd59fe60242ee94a243f67d0ea92b29ac48a5c59b7e0" Mar 21 03:50:57 crc kubenswrapper[4685]: E0321 03:50:57.790756 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d478a553216614720b07cd59fe60242ee94a243f67d0ea92b29ac48a5c59b7e0\": container with ID starting with d478a553216614720b07cd59fe60242ee94a243f67d0ea92b29ac48a5c59b7e0 not found: ID does not exist" containerID="d478a553216614720b07cd59fe60242ee94a243f67d0ea92b29ac48a5c59b7e0" Mar 21 03:50:57 crc kubenswrapper[4685]: I0321 03:50:57.790785 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d478a553216614720b07cd59fe60242ee94a243f67d0ea92b29ac48a5c59b7e0"} err="failed to get container status \"d478a553216614720b07cd59fe60242ee94a243f67d0ea92b29ac48a5c59b7e0\": rpc error: code = NotFound desc = could not find container \"d478a553216614720b07cd59fe60242ee94a243f67d0ea92b29ac48a5c59b7e0\": container with ID starting with d478a553216614720b07cd59fe60242ee94a243f67d0ea92b29ac48a5c59b7e0 not found: ID does not exist" Mar 21 03:50:57 crc kubenswrapper[4685]: I0321 03:50:57.790801 4685 scope.go:117] "RemoveContainer" containerID="8baa23e2996f7d8a9daba457d2de4f70d70272f6a61170df0cdd73c0fef91103" Mar 21 03:50:57 crc kubenswrapper[4685]: E0321 03:50:57.791068 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8baa23e2996f7d8a9daba457d2de4f70d70272f6a61170df0cdd73c0fef91103\": container with ID starting with 8baa23e2996f7d8a9daba457d2de4f70d70272f6a61170df0cdd73c0fef91103 not found: ID does not exist" containerID="8baa23e2996f7d8a9daba457d2de4f70d70272f6a61170df0cdd73c0fef91103" Mar 21 03:50:57 crc kubenswrapper[4685]: I0321 03:50:57.791095 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8baa23e2996f7d8a9daba457d2de4f70d70272f6a61170df0cdd73c0fef91103"} err="failed to get container status \"8baa23e2996f7d8a9daba457d2de4f70d70272f6a61170df0cdd73c0fef91103\": rpc error: code = NotFound desc = could not find container \"8baa23e2996f7d8a9daba457d2de4f70d70272f6a61170df0cdd73c0fef91103\": container with ID starting with 8baa23e2996f7d8a9daba457d2de4f70d70272f6a61170df0cdd73c0fef91103 not found: ID does not exist" Mar 21 03:50:58 crc kubenswrapper[4685]: I0321 03:50:58.308520 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4ebdf18-8426-42cc-93a6-60b46261aebe" path="/var/lib/kubelet/pods/d4ebdf18-8426-42cc-93a6-60b46261aebe/volumes" Mar 21 03:50:58 crc kubenswrapper[4685]: I0321 03:50:58.737331 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" Mar 21 03:50:58 crc kubenswrapper[4685]: I0321 03:50:58.745805 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" Mar 21 03:50:59 crc kubenswrapper[4685]: I0321 03:50:59.325529 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zrnqq"] Mar 21 03:50:59 crc kubenswrapper[4685]: I0321 03:50:59.325786 4685 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-zrnqq" podUID="f0e0a768-d2ec-4986-8cc7-72f0bd1d285a" containerName="registry-server" containerID="cri-o://31deeee9e2e17aaa1f7fce67d10f5816041023ee773c3b8ac942aa018c0dcde1" gracePeriod=2 Mar 21 03:50:59 crc kubenswrapper[4685]: I0321 03:50:59.744539 4685 generic.go:334] "Generic (PLEG): container finished" podID="f0e0a768-d2ec-4986-8cc7-72f0bd1d285a" containerID="31deeee9e2e17aaa1f7fce67d10f5816041023ee773c3b8ac942aa018c0dcde1" exitCode=0 Mar 21 03:50:59 crc kubenswrapper[4685]: I0321 03:50:59.744623 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zrnqq" event={"ID":"f0e0a768-d2ec-4986-8cc7-72f0bd1d285a","Type":"ContainerDied","Data":"31deeee9e2e17aaa1f7fce67d10f5816041023ee773c3b8ac942aa018c0dcde1"} Mar 21 03:50:59 crc kubenswrapper[4685]: I0321 03:50:59.744965 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zrnqq" event={"ID":"f0e0a768-d2ec-4986-8cc7-72f0bd1d285a","Type":"ContainerDied","Data":"d82770b3325a3670af07adf62156d88411d0f79ae22ac300c93532c72f76106a"} Mar 21 03:50:59 crc kubenswrapper[4685]: I0321 03:50:59.744985 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d82770b3325a3670af07adf62156d88411d0f79ae22ac300c93532c72f76106a" Mar 21 03:50:59 crc kubenswrapper[4685]: I0321 03:50:59.746385 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zrnqq" Mar 21 03:50:59 crc kubenswrapper[4685]: I0321 03:50:59.929718 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e0a768-d2ec-4986-8cc7-72f0bd1d285a-catalog-content\") pod \"f0e0a768-d2ec-4986-8cc7-72f0bd1d285a\" (UID: \"f0e0a768-d2ec-4986-8cc7-72f0bd1d285a\") " Mar 21 03:50:59 crc kubenswrapper[4685]: I0321 03:50:59.929759 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e0a768-d2ec-4986-8cc7-72f0bd1d285a-utilities\") pod \"f0e0a768-d2ec-4986-8cc7-72f0bd1d285a\" (UID: \"f0e0a768-d2ec-4986-8cc7-72f0bd1d285a\") " Mar 21 03:50:59 crc kubenswrapper[4685]: I0321 03:50:59.929873 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz76z\" (UniqueName: \"kubernetes.io/projected/f0e0a768-d2ec-4986-8cc7-72f0bd1d285a-kube-api-access-cz76z\") pod \"f0e0a768-d2ec-4986-8cc7-72f0bd1d285a\" (UID: \"f0e0a768-d2ec-4986-8cc7-72f0bd1d285a\") " Mar 21 03:50:59 crc kubenswrapper[4685]: I0321 03:50:59.931062 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0e0a768-d2ec-4986-8cc7-72f0bd1d285a-utilities" (OuterVolumeSpecName: "utilities") pod "f0e0a768-d2ec-4986-8cc7-72f0bd1d285a" (UID: "f0e0a768-d2ec-4986-8cc7-72f0bd1d285a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 03:50:59 crc kubenswrapper[4685]: I0321 03:50:59.949723 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0e0a768-d2ec-4986-8cc7-72f0bd1d285a-kube-api-access-cz76z" (OuterVolumeSpecName: "kube-api-access-cz76z") pod "f0e0a768-d2ec-4986-8cc7-72f0bd1d285a" (UID: "f0e0a768-d2ec-4986-8cc7-72f0bd1d285a"). InnerVolumeSpecName "kube-api-access-cz76z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:50:59 crc kubenswrapper[4685]: I0321 03:50:59.957537 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0e0a768-d2ec-4986-8cc7-72f0bd1d285a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f0e0a768-d2ec-4986-8cc7-72f0bd1d285a" (UID: "f0e0a768-d2ec-4986-8cc7-72f0bd1d285a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 03:51:00 crc kubenswrapper[4685]: I0321 03:51:00.031474 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cz76z\" (UniqueName: \"kubernetes.io/projected/f0e0a768-d2ec-4986-8cc7-72f0bd1d285a-kube-api-access-cz76z\") on node \"crc\" DevicePath \"\"" Mar 21 03:51:00 crc kubenswrapper[4685]: I0321 03:51:00.031508 4685 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e0a768-d2ec-4986-8cc7-72f0bd1d285a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 03:51:00 crc kubenswrapper[4685]: I0321 03:51:00.031518 4685 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e0a768-d2ec-4986-8cc7-72f0bd1d285a-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 03:51:00 crc kubenswrapper[4685]: E0321 03:51:00.384520 4685 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0e0a768_d2ec_4986_8cc7_72f0bd1d285a.slice/crio-d82770b3325a3670af07adf62156d88411d0f79ae22ac300c93532c72f76106a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0e0a768_d2ec_4986_8cc7_72f0bd1d285a.slice\": RecentStats: unable to find data in memory cache]" Mar 21 03:51:00 crc kubenswrapper[4685]: I0321 03:51:00.752222 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zrnqq" Mar 21 03:51:00 crc kubenswrapper[4685]: I0321 03:51:00.779882 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zrnqq"] Mar 21 03:51:00 crc kubenswrapper[4685]: I0321 03:51:00.784555 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zrnqq"] Mar 21 03:51:01 crc kubenswrapper[4685]: I0321 03:51:01.730046 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8qq2k"] Mar 21 03:51:01 crc kubenswrapper[4685]: I0321 03:51:01.730427 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8qq2k" podUID="855b8c82-585a-4883-acdc-195377b480c2" containerName="registry-server" containerID="cri-o://f9ee7e5feba13eccb34afa5067ffbeb4ff13d78d67e6a5fafb290b1798c88b2b" gracePeriod=2 Mar 21 03:51:02 crc kubenswrapper[4685]: I0321 03:51:02.143218 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8qq2k" Mar 21 03:51:02 crc kubenswrapper[4685]: I0321 03:51:02.260872 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kgmg\" (UniqueName: \"kubernetes.io/projected/855b8c82-585a-4883-acdc-195377b480c2-kube-api-access-6kgmg\") pod \"855b8c82-585a-4883-acdc-195377b480c2\" (UID: \"855b8c82-585a-4883-acdc-195377b480c2\") " Mar 21 03:51:02 crc kubenswrapper[4685]: I0321 03:51:02.260928 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/855b8c82-585a-4883-acdc-195377b480c2-utilities\") pod \"855b8c82-585a-4883-acdc-195377b480c2\" (UID: \"855b8c82-585a-4883-acdc-195377b480c2\") " Mar 21 03:51:02 crc kubenswrapper[4685]: I0321 03:51:02.261012 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/855b8c82-585a-4883-acdc-195377b480c2-catalog-content\") pod \"855b8c82-585a-4883-acdc-195377b480c2\" (UID: \"855b8c82-585a-4883-acdc-195377b480c2\") " Mar 21 03:51:02 crc kubenswrapper[4685]: I0321 03:51:02.262072 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/855b8c82-585a-4883-acdc-195377b480c2-utilities" (OuterVolumeSpecName: "utilities") pod "855b8c82-585a-4883-acdc-195377b480c2" (UID: "855b8c82-585a-4883-acdc-195377b480c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 03:51:02 crc kubenswrapper[4685]: I0321 03:51:02.268098 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/855b8c82-585a-4883-acdc-195377b480c2-kube-api-access-6kgmg" (OuterVolumeSpecName: "kube-api-access-6kgmg") pod "855b8c82-585a-4883-acdc-195377b480c2" (UID: "855b8c82-585a-4883-acdc-195377b480c2"). InnerVolumeSpecName "kube-api-access-6kgmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:51:02 crc kubenswrapper[4685]: I0321 03:51:02.308017 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0e0a768-d2ec-4986-8cc7-72f0bd1d285a" path="/var/lib/kubelet/pods/f0e0a768-d2ec-4986-8cc7-72f0bd1d285a/volumes" Mar 21 03:51:02 crc kubenswrapper[4685]: I0321 03:51:02.362170 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kgmg\" (UniqueName: \"kubernetes.io/projected/855b8c82-585a-4883-acdc-195377b480c2-kube-api-access-6kgmg\") on node \"crc\" DevicePath \"\"" Mar 21 03:51:02 crc kubenswrapper[4685]: I0321 03:51:02.362201 4685 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/855b8c82-585a-4883-acdc-195377b480c2-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 03:51:02 crc kubenswrapper[4685]: I0321 03:51:02.413555 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/855b8c82-585a-4883-acdc-195377b480c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "855b8c82-585a-4883-acdc-195377b480c2" (UID: "855b8c82-585a-4883-acdc-195377b480c2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 03:51:02 crc kubenswrapper[4685]: I0321 03:51:02.463008 4685 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/855b8c82-585a-4883-acdc-195377b480c2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 03:51:02 crc kubenswrapper[4685]: I0321 03:51:02.764001 4685 generic.go:334] "Generic (PLEG): container finished" podID="855b8c82-585a-4883-acdc-195377b480c2" containerID="f9ee7e5feba13eccb34afa5067ffbeb4ff13d78d67e6a5fafb290b1798c88b2b" exitCode=0 Mar 21 03:51:02 crc kubenswrapper[4685]: I0321 03:51:02.764042 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8qq2k" event={"ID":"855b8c82-585a-4883-acdc-195377b480c2","Type":"ContainerDied","Data":"f9ee7e5feba13eccb34afa5067ffbeb4ff13d78d67e6a5fafb290b1798c88b2b"} Mar 21 03:51:02 crc kubenswrapper[4685]: I0321 03:51:02.764058 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8qq2k" Mar 21 03:51:02 crc kubenswrapper[4685]: I0321 03:51:02.764071 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8qq2k" event={"ID":"855b8c82-585a-4883-acdc-195377b480c2","Type":"ContainerDied","Data":"9f6147bc58108bfff3269ed69ea4835e65ba044a497c4dc7dd1d1a1c4b9ea31e"} Mar 21 03:51:02 crc kubenswrapper[4685]: I0321 03:51:02.764088 4685 scope.go:117] "RemoveContainer" containerID="f9ee7e5feba13eccb34afa5067ffbeb4ff13d78d67e6a5fafb290b1798c88b2b" Mar 21 03:51:02 crc kubenswrapper[4685]: I0321 03:51:02.780792 4685 scope.go:117] "RemoveContainer" containerID="ff6ba428faf1dd45dd1cca847ad4617672ff88cd427a323bebca1b9be6a3f2d9" Mar 21 03:51:02 crc kubenswrapper[4685]: I0321 03:51:02.789259 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8qq2k"] Mar 21 03:51:02 crc kubenswrapper[4685]: I0321 03:51:02.794915 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8qq2k"] Mar 21 03:51:02 crc kubenswrapper[4685]: I0321 03:51:02.813884 4685 scope.go:117] "RemoveContainer" containerID="821c089f9c6fa6ae58872ec41bd332232ad8efc8e3e1194d906407aa95ebc527" Mar 21 03:51:02 crc kubenswrapper[4685]: I0321 03:51:02.827275 4685 scope.go:117] "RemoveContainer" containerID="f9ee7e5feba13eccb34afa5067ffbeb4ff13d78d67e6a5fafb290b1798c88b2b" Mar 21 03:51:02 crc kubenswrapper[4685]: E0321 03:51:02.827667 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9ee7e5feba13eccb34afa5067ffbeb4ff13d78d67e6a5fafb290b1798c88b2b\": container with ID starting with f9ee7e5feba13eccb34afa5067ffbeb4ff13d78d67e6a5fafb290b1798c88b2b not found: ID does not exist" containerID="f9ee7e5feba13eccb34afa5067ffbeb4ff13d78d67e6a5fafb290b1798c88b2b" Mar 21 03:51:02 crc kubenswrapper[4685]: I0321 03:51:02.827707 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9ee7e5feba13eccb34afa5067ffbeb4ff13d78d67e6a5fafb290b1798c88b2b"} err="failed to get container status \"f9ee7e5feba13eccb34afa5067ffbeb4ff13d78d67e6a5fafb290b1798c88b2b\": rpc error: code = NotFound desc = could not find container \"f9ee7e5feba13eccb34afa5067ffbeb4ff13d78d67e6a5fafb290b1798c88b2b\": container with ID starting with f9ee7e5feba13eccb34afa5067ffbeb4ff13d78d67e6a5fafb290b1798c88b2b not found: ID does not exist" Mar 21 03:51:02 crc 
kubenswrapper[4685]: I0321 03:51:02.827733 4685 scope.go:117] "RemoveContainer" containerID="ff6ba428faf1dd45dd1cca847ad4617672ff88cd427a323bebca1b9be6a3f2d9" Mar 21 03:51:02 crc kubenswrapper[4685]: E0321 03:51:02.827958 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff6ba428faf1dd45dd1cca847ad4617672ff88cd427a323bebca1b9be6a3f2d9\": container with ID starting with ff6ba428faf1dd45dd1cca847ad4617672ff88cd427a323bebca1b9be6a3f2d9 not found: ID does not exist" containerID="ff6ba428faf1dd45dd1cca847ad4617672ff88cd427a323bebca1b9be6a3f2d9" Mar 21 03:51:02 crc kubenswrapper[4685]: I0321 03:51:02.827998 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff6ba428faf1dd45dd1cca847ad4617672ff88cd427a323bebca1b9be6a3f2d9"} err="failed to get container status \"ff6ba428faf1dd45dd1cca847ad4617672ff88cd427a323bebca1b9be6a3f2d9\": rpc error: code = NotFound desc = could not find container \"ff6ba428faf1dd45dd1cca847ad4617672ff88cd427a323bebca1b9be6a3f2d9\": container with ID starting with ff6ba428faf1dd45dd1cca847ad4617672ff88cd427a323bebca1b9be6a3f2d9 not found: ID does not exist" Mar 21 03:51:02 crc kubenswrapper[4685]: I0321 03:51:02.828019 4685 scope.go:117] "RemoveContainer" containerID="821c089f9c6fa6ae58872ec41bd332232ad8efc8e3e1194d906407aa95ebc527" Mar 21 03:51:02 crc kubenswrapper[4685]: E0321 03:51:02.828254 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"821c089f9c6fa6ae58872ec41bd332232ad8efc8e3e1194d906407aa95ebc527\": container with ID starting with 821c089f9c6fa6ae58872ec41bd332232ad8efc8e3e1194d906407aa95ebc527 not found: ID does not exist" containerID="821c089f9c6fa6ae58872ec41bd332232ad8efc8e3e1194d906407aa95ebc527" Mar 21 03:51:02 crc kubenswrapper[4685]: I0321 03:51:02.828283 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"821c089f9c6fa6ae58872ec41bd332232ad8efc8e3e1194d906407aa95ebc527"} err="failed to get container status \"821c089f9c6fa6ae58872ec41bd332232ad8efc8e3e1194d906407aa95ebc527\": rpc error: code = NotFound desc = could not find container \"821c089f9c6fa6ae58872ec41bd332232ad8efc8e3e1194d906407aa95ebc527\": container with ID starting with 821c089f9c6fa6ae58872ec41bd332232ad8efc8e3e1194d906407aa95ebc527 not found: ID does not exist" Mar 21 03:51:04 crc kubenswrapper[4685]: I0321 03:51:04.308522 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="855b8c82-585a-4883-acdc-195377b480c2" path="/var/lib/kubelet/pods/855b8c82-585a-4883-acdc-195377b480c2/volumes" Mar 21 03:51:05 crc kubenswrapper[4685]: I0321 03:51:05.469473 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m7q8h"] Mar 21 03:51:12 crc kubenswrapper[4685]: I0321 03:51:12.608114 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-75994f449d-jj86b"] Mar 21 03:51:12 crc kubenswrapper[4685]: I0321 03:51:12.609102 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" podUID="ede8a25b-fb4b-4cbc-914d-1fc24155b8f1" containerName="controller-manager" containerID="cri-o://b4bea199b4019ea408a4c73277feb201cbfe2473afe511bae3b416a4fb570d0f" gracePeriod=30 Mar 21 03:51:12 crc kubenswrapper[4685]: I0321 03:51:12.700995 4685 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv"] Mar 21 03:51:12 crc kubenswrapper[4685]: I0321 03:51:12.701481 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" podUID="de82f429-1aa8-465f-b944-2bfb17d7d26b" containerName="route-controller-manager" containerID="cri-o://34cb40a6d28fd95aa004fa6ca3edfc26871731c6adc4c21264ccaa096616a292" gracePeriod=30 Mar 21 03:51:12 crc kubenswrapper[4685]: I0321 03:51:12.875338 4685 generic.go:334] "Generic (PLEG): container finished" podID="de82f429-1aa8-465f-b944-2bfb17d7d26b" containerID="34cb40a6d28fd95aa004fa6ca3edfc26871731c6adc4c21264ccaa096616a292" exitCode=0 Mar 21 03:51:12 crc kubenswrapper[4685]: I0321 03:51:12.875434 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" event={"ID":"de82f429-1aa8-465f-b944-2bfb17d7d26b","Type":"ContainerDied","Data":"34cb40a6d28fd95aa004fa6ca3edfc26871731c6adc4c21264ccaa096616a292"} Mar 21 03:51:12 crc kubenswrapper[4685]: I0321 03:51:12.878898 4685 generic.go:334] "Generic (PLEG): container finished" podID="ede8a25b-fb4b-4cbc-914d-1fc24155b8f1" containerID="b4bea199b4019ea408a4c73277feb201cbfe2473afe511bae3b416a4fb570d0f" exitCode=0 Mar 21 03:51:12 crc kubenswrapper[4685]: I0321 03:51:12.878968 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" event={"ID":"ede8a25b-fb4b-4cbc-914d-1fc24155b8f1","Type":"ContainerDied","Data":"b4bea199b4019ea408a4c73277feb201cbfe2473afe511bae3b416a4fb570d0f"} Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.271902 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.281098 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.300037 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de82f429-1aa8-465f-b944-2bfb17d7d26b-config\") pod \"de82f429-1aa8-465f-b944-2bfb17d7d26b\" (UID: \"de82f429-1aa8-465f-b944-2bfb17d7d26b\") " Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.300243 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fwq2\" (UniqueName: \"kubernetes.io/projected/de82f429-1aa8-465f-b944-2bfb17d7d26b-kube-api-access-9fwq2\") pod \"de82f429-1aa8-465f-b944-2bfb17d7d26b\" (UID: \"de82f429-1aa8-465f-b944-2bfb17d7d26b\") " Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.300290 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de82f429-1aa8-465f-b944-2bfb17d7d26b-client-ca\") pod \"de82f429-1aa8-465f-b944-2bfb17d7d26b\" (UID: \"de82f429-1aa8-465f-b944-2bfb17d7d26b\") " Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.300351 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de82f429-1aa8-465f-b944-2bfb17d7d26b-serving-cert\") pod \"de82f429-1aa8-465f-b944-2bfb17d7d26b\" (UID: \"de82f429-1aa8-465f-b944-2bfb17d7d26b\") " Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.301801 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de82f429-1aa8-465f-b944-2bfb17d7d26b-client-ca" (OuterVolumeSpecName: "client-ca") pod "de82f429-1aa8-465f-b944-2bfb17d7d26b" (UID: "de82f429-1aa8-465f-b944-2bfb17d7d26b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.303066 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de82f429-1aa8-465f-b944-2bfb17d7d26b-config" (OuterVolumeSpecName: "config") pod "de82f429-1aa8-465f-b944-2bfb17d7d26b" (UID: "de82f429-1aa8-465f-b944-2bfb17d7d26b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.308620 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de82f429-1aa8-465f-b944-2bfb17d7d26b-kube-api-access-9fwq2" (OuterVolumeSpecName: "kube-api-access-9fwq2") pod "de82f429-1aa8-465f-b944-2bfb17d7d26b" (UID: "de82f429-1aa8-465f-b944-2bfb17d7d26b"). InnerVolumeSpecName "kube-api-access-9fwq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.308713 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de82f429-1aa8-465f-b944-2bfb17d7d26b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "de82f429-1aa8-465f-b944-2bfb17d7d26b" (UID: "de82f429-1aa8-465f-b944-2bfb17d7d26b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.358130 4685 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 21 03:51:13 crc kubenswrapper[4685]: E0321 03:51:13.358475 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0e0a768-d2ec-4986-8cc7-72f0bd1d285a" containerName="extract-utilities" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.358498 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0e0a768-d2ec-4986-8cc7-72f0bd1d285a" containerName="extract-utilities" Mar 21 03:51:13 crc kubenswrapper[4685]: E0321 03:51:13.358514 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4ebdf18-8426-42cc-93a6-60b46261aebe" containerName="extract-utilities" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.358523 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4ebdf18-8426-42cc-93a6-60b46261aebe" containerName="extract-utilities" Mar 21 03:51:13 crc kubenswrapper[4685]: E0321 03:51:13.358539 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4ebdf18-8426-42cc-93a6-60b46261aebe" containerName="registry-server" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.358547 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4ebdf18-8426-42cc-93a6-60b46261aebe" containerName="registry-server" Mar 21 03:51:13 crc kubenswrapper[4685]: E0321 03:51:13.358560 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0e0a768-d2ec-4986-8cc7-72f0bd1d285a" containerName="extract-content" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.358567 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0e0a768-d2ec-4986-8cc7-72f0bd1d285a" containerName="extract-content" Mar 21 03:51:13 crc kubenswrapper[4685]: E0321 03:51:13.358580 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="855b8c82-585a-4883-acdc-195377b480c2" containerName="registry-server" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.358587 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="855b8c82-585a-4883-acdc-195377b480c2" containerName="registry-server" Mar 21 03:51:13 crc kubenswrapper[4685]: E0321 03:51:13.358602 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0e0a768-d2ec-4986-8cc7-72f0bd1d285a" containerName="registry-server" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.358610 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0e0a768-d2ec-4986-8cc7-72f0bd1d285a" containerName="registry-server" Mar 21 03:51:13 crc kubenswrapper[4685]: E0321 03:51:13.358620 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ede8a25b-fb4b-4cbc-914d-1fc24155b8f1" containerName="controller-manager" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.358629 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="ede8a25b-fb4b-4cbc-914d-1fc24155b8f1" containerName="controller-manager" Mar 21 03:51:13 crc kubenswrapper[4685]: E0321 03:51:13.358639 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de82f429-1aa8-465f-b944-2bfb17d7d26b" containerName="route-controller-manager" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.358648 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="de82f429-1aa8-465f-b944-2bfb17d7d26b" containerName="route-controller-manager" Mar 21 03:51:13 crc kubenswrapper[4685]: E0321 03:51:13.358663 4685 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="855b8c82-585a-4883-acdc-195377b480c2" containerName="extract-utilities" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.358671 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="855b8c82-585a-4883-acdc-195377b480c2" containerName="extract-utilities" Mar 21 03:51:13 crc kubenswrapper[4685]: E0321 03:51:13.358682 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="855b8c82-585a-4883-acdc-195377b480c2" containerName="extract-content" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.358689 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="855b8c82-585a-4883-acdc-195377b480c2" containerName="extract-content" Mar 21 03:51:13 crc kubenswrapper[4685]: E0321 03:51:13.358701 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4ebdf18-8426-42cc-93a6-60b46261aebe" containerName="extract-content" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.358708 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4ebdf18-8426-42cc-93a6-60b46261aebe" containerName="extract-content" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.358870 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0e0a768-d2ec-4986-8cc7-72f0bd1d285a" containerName="registry-server" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.358883 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4ebdf18-8426-42cc-93a6-60b46261aebe" containerName="registry-server" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.358895 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="ede8a25b-fb4b-4cbc-914d-1fc24155b8f1" containerName="controller-manager" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.358907 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="de82f429-1aa8-465f-b944-2bfb17d7d26b" containerName="route-controller-manager" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.358917 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="855b8c82-585a-4883-acdc-195377b480c2" containerName="registry-server" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.359237 4685 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.359431 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.359660 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://cf0ba4af6f1f48fcb9ccf07dec53dc3ff1835a83dbf535bc48feb68fd646e78f" gracePeriod=15 Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.360342 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://ba587c4fe2f05966282b50ba5236b9f3d9ef6de63f72c70ae9f7a5222cb8b904" gracePeriod=15 Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.360392 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://60c96edd458d05f217a2e9f07a44bd221303d821a790382a82cff0b912d48f63" gracePeriod=15 Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.360396 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://a8583e9b5dff82d5df52e281ba4069e9259b1c8fe3d1b8121d0e9f3f9e97d47b" gracePeriod=15 Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.360437 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://182ad351b6c632ea64087d4784ea919d5c21165dc5d0373fa35db7f7f1eea435" gracePeriod=15 Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.361161 4685 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 21 03:51:13 crc kubenswrapper[4685]: E0321 03:51:13.361297 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.361315 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 03:51:13 crc kubenswrapper[4685]: E0321 03:51:13.361324 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.361334 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 21 03:51:13 crc kubenswrapper[4685]: E0321 03:51:13.361345 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.361353 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 03:51:13 crc kubenswrapper[4685]: E0321 03:51:13.361361 4685 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.361369 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 03:51:13 crc kubenswrapper[4685]: E0321 03:51:13.361380 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.361388 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 03:51:13 crc kubenswrapper[4685]: E0321 03:51:13.361397 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.361405 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 21 03:51:13 crc kubenswrapper[4685]: E0321 03:51:13.361417 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.361424 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 21 03:51:13 crc kubenswrapper[4685]: E0321 03:51:13.361439 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.361447 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 21 03:51:13 crc kubenswrapper[4685]: E0321 03:51:13.361457 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.361465 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.361569 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.361579 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.361588 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.361597 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.361606 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.361618 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-insecure-readyz" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.361628 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 21 03:51:13 crc kubenswrapper[4685]: E0321 03:51:13.361730 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.361739 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.361945 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.361956 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.401669 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ede8a25b-fb4b-4cbc-914d-1fc24155b8f1-proxy-ca-bundles\") pod \"ede8a25b-fb4b-4cbc-914d-1fc24155b8f1\" (UID: \"ede8a25b-fb4b-4cbc-914d-1fc24155b8f1\") " Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.401758 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ede8a25b-fb4b-4cbc-914d-1fc24155b8f1-serving-cert\") pod \"ede8a25b-fb4b-4cbc-914d-1fc24155b8f1\" (UID: \"ede8a25b-fb4b-4cbc-914d-1fc24155b8f1\") " Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.401811 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ede8a25b-fb4b-4cbc-914d-1fc24155b8f1-client-ca\") pod \"ede8a25b-fb4b-4cbc-914d-1fc24155b8f1\" (UID: \"ede8a25b-fb4b-4cbc-914d-1fc24155b8f1\") " Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.401932 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ede8a25b-fb4b-4cbc-914d-1fc24155b8f1-config\") pod \"ede8a25b-fb4b-4cbc-914d-1fc24155b8f1\" (UID: \"ede8a25b-fb4b-4cbc-914d-1fc24155b8f1\") " Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.402040 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r959n\" (UniqueName: \"kubernetes.io/projected/ede8a25b-fb4b-4cbc-914d-1fc24155b8f1-kube-api-access-r959n\") pod \"ede8a25b-fb4b-4cbc-914d-1fc24155b8f1\" (UID: \"ede8a25b-fb4b-4cbc-914d-1fc24155b8f1\") " Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.402285 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.402357 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.402399 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.402467 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.402491 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ede8a25b-fb4b-4cbc-914d-1fc24155b8f1-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ede8a25b-fb4b-4cbc-914d-1fc24155b8f1" (UID: "ede8a25b-fb4b-4cbc-914d-1fc24155b8f1"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.402546 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.402614 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.402651 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.402725 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.402822 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ede8a25b-fb4b-4cbc-914d-1fc24155b8f1-client-ca" (OuterVolumeSpecName: "client-ca") pod "ede8a25b-fb4b-4cbc-914d-1fc24155b8f1" (UID: "ede8a25b-fb4b-4cbc-914d-1fc24155b8f1"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.402861 4685 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de82f429-1aa8-465f-b944-2bfb17d7d26b-config\") on node \"crc\" DevicePath \"\"" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.402888 4685 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ede8a25b-fb4b-4cbc-914d-1fc24155b8f1-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.402924 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fwq2\" (UniqueName: \"kubernetes.io/projected/de82f429-1aa8-465f-b944-2bfb17d7d26b-kube-api-access-9fwq2\") on node \"crc\" DevicePath \"\"" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.402942 4685 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de82f429-1aa8-465f-b944-2bfb17d7d26b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.402954 4685 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de82f429-1aa8-465f-b944-2bfb17d7d26b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.403400 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.405784 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ede8a25b-fb4b-4cbc-914d-1fc24155b8f1-config" (OuterVolumeSpecName: "config") pod "ede8a25b-fb4b-4cbc-914d-1fc24155b8f1" (UID: "ede8a25b-fb4b-4cbc-914d-1fc24155b8f1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.407279 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ede8a25b-fb4b-4cbc-914d-1fc24155b8f1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ede8a25b-fb4b-4cbc-914d-1fc24155b8f1" (UID: "ede8a25b-fb4b-4cbc-914d-1fc24155b8f1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.407418 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ede8a25b-fb4b-4cbc-914d-1fc24155b8f1-kube-api-access-r959n" (OuterVolumeSpecName: "kube-api-access-r959n") pod "ede8a25b-fb4b-4cbc-914d-1fc24155b8f1" (UID: "ede8a25b-fb4b-4cbc-914d-1fc24155b8f1"). InnerVolumeSpecName "kube-api-access-r959n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.504143 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.504198 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.504230 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.504256 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.504288 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.504316 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.504330 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.504359 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.504410 4685 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ede8a25b-fb4b-4cbc-914d-1fc24155b8f1-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.504427 4685 reconciler_common.go:293] "Volume detached for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/ede8a25b-fb4b-4cbc-914d-1fc24155b8f1-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.504437 4685 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ede8a25b-fb4b-4cbc-914d-1fc24155b8f1-config\") on node \"crc\" DevicePath \"\"" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.504449 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r959n\" (UniqueName: \"kubernetes.io/projected/ede8a25b-fb4b-4cbc-914d-1fc24155b8f1-kube-api-access-r959n\") on node \"crc\" DevicePath \"\"" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.504510 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.504554 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.504582 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.504608 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.504633 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.504654 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.504676 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.504703 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.699818 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 03:51:13 crc kubenswrapper[4685]: W0321 03:51:13.732602 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-e86fa125ff2856652dd1947acf2cbcfadc4a9fd4a7f59f597fd42f61404d00a3 WatchSource:0}: Error finding container e86fa125ff2856652dd1947acf2cbcfadc4a9fd4a7f59f597fd42f61404d00a3: Status 404 returned error can't find the container with id e86fa125ff2856652dd1947acf2cbcfadc4a9fd4a7f59f597fd42f61404d00a3 Mar 21 03:51:13 crc kubenswrapper[4685]: E0321 03:51:13.737264 4685 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.158:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189ebec90074fa75 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:51:13.735924341 +0000 UTC m=+306.212993173,LastTimestamp:2026-03-21 03:51:13.735924341 +0000 UTC m=+306.212993173,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.888309 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" event={"ID":"ede8a25b-fb4b-4cbc-914d-1fc24155b8f1","Type":"ContainerDied","Data":"d2fe2e58c472834e3ddfc65aaef49ca0fdc3eb17e4fc2e024cbf1249f7897d52"} Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.888362 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.888373 4685 scope.go:117] "RemoveContainer" containerID="b4bea199b4019ea408a4c73277feb201cbfe2473afe511bae3b416a4fb570d0f" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.890087 4685 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.890491 4685 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.891037 4685 status_manager.go:851] "Failed to get status for pod" podUID="ede8a25b-fb4b-4cbc-914d-1fc24155b8f1" pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-75994f449d-jj86b\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.891466 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"e86fa125ff2856652dd1947acf2cbcfadc4a9fd4a7f59f597fd42f61404d00a3"} Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.896728 4685 generic.go:334] "Generic (PLEG): container finished" podID="7b92b45a-4c33-4463-96c1-0d4227d1d118" containerID="8685673bd128e91f0736b69d2792898a866d876fcab17106ff676fa213f77a21" exitCode=0 Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.896822 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"7b92b45a-4c33-4463-96c1-0d4227d1d118","Type":"ContainerDied","Data":"8685673bd128e91f0736b69d2792898a866d876fcab17106ff676fa213f77a21"} Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.898334 4685 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.899185 4685 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.899524 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.899565 4685 status_manager.go:851] 
"Failed to get status for pod" podUID="ede8a25b-fb4b-4cbc-914d-1fc24155b8f1" pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-75994f449d-jj86b\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.899909 4685 status_manager.go:851] "Failed to get status for pod" podUID="7b92b45a-4c33-4463-96c1-0d4227d1d118" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.901434 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.902679 4685 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cf0ba4af6f1f48fcb9ccf07dec53dc3ff1835a83dbf535bc48feb68fd646e78f" exitCode=0 Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.902708 4685 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="182ad351b6c632ea64087d4784ea919d5c21165dc5d0373fa35db7f7f1eea435" exitCode=0 Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.902720 4685 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="60c96edd458d05f217a2e9f07a44bd221303d821a790382a82cff0b912d48f63" exitCode=0 Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.902731 4685 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a8583e9b5dff82d5df52e281ba4069e9259b1c8fe3d1b8121d0e9f3f9e97d47b" exitCode=2 Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.906565 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" event={"ID":"de82f429-1aa8-465f-b944-2bfb17d7d26b","Type":"ContainerDied","Data":"08608c8d3f66aa37fcbc6a66679212d0d22022972aa9c737ca6d721ee36f9883"} Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.906655 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.907372 4685 status_manager.go:851] "Failed to get status for pod" podUID="de82f429-1aa8-465f-b944-2bfb17d7d26b" pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5bc69c9d55-7trxv\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.908165 4685 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.908451 4685 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.908633 4685 status_manager.go:851] "Failed to get status for pod" podUID="ede8a25b-fb4b-4cbc-914d-1fc24155b8f1" pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-75994f449d-jj86b\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.908788 4685 status_manager.go:851] "Failed to get status for pod" podUID="7b92b45a-4c33-4463-96c1-0d4227d1d118" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.909017 4685 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.909179 4685 status_manager.go:851] "Failed to get status for pod" podUID="ede8a25b-fb4b-4cbc-914d-1fc24155b8f1" pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-75994f449d-jj86b\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.909319 4685 status_manager.go:851] "Failed to get status for pod" podUID="7b92b45a-4c33-4463-96c1-0d4227d1d118" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.909501 4685 status_manager.go:851] "Failed to get status for pod" podUID="de82f429-1aa8-465f-b944-2bfb17d7d26b" 
pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5bc69c9d55-7trxv\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.909677 4685 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.920443 4685 scope.go:117] "RemoveContainer" containerID="c896022c1e0726ccc5a54da9c432a0ff4082158ad76abcea60c0a1427c69134b" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.966907 4685 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.967341 4685 status_manager.go:851] "Failed to get status for pod" podUID="ede8a25b-fb4b-4cbc-914d-1fc24155b8f1" pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-75994f449d-jj86b\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.967622 4685 status_manager.go:851] "Failed to get status for pod" podUID="7b92b45a-4c33-4463-96c1-0d4227d1d118" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.967908 4685 status_manager.go:851] "Failed to get status for pod" podUID="de82f429-1aa8-465f-b944-2bfb17d7d26b" pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5bc69c9d55-7trxv\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.968179 4685 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:13 crc kubenswrapper[4685]: I0321 03:51:13.974757 4685 scope.go:117] "RemoveContainer" containerID="34cb40a6d28fd95aa004fa6ca3edfc26871731c6adc4c21264ccaa096616a292" Mar 21 03:51:14 crc kubenswrapper[4685]: I0321 03:51:14.921555 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"ef0e430dc19ca6925ec78663fdc64f0e2ce9c40649685826ca31d883109b727c"} Mar 21 03:51:14 crc kubenswrapper[4685]: I0321 03:51:14.922987 4685 
status_manager.go:851] "Failed to get status for pod" podUID="7b92b45a-4c33-4463-96c1-0d4227d1d118" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:14 crc kubenswrapper[4685]: I0321 03:51:14.923455 4685 status_manager.go:851] "Failed to get status for pod" podUID="de82f429-1aa8-465f-b944-2bfb17d7d26b" pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5bc69c9d55-7trxv\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:14 crc kubenswrapper[4685]: I0321 03:51:14.925311 4685 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:14 crc kubenswrapper[4685]: I0321 03:51:14.926308 4685 status_manager.go:851] "Failed to get status for pod" podUID="ede8a25b-fb4b-4cbc-914d-1fc24155b8f1" pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-75994f449d-jj86b\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:14 crc kubenswrapper[4685]: I0321 03:51:14.928811 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.368557 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.369874 4685 status_manager.go:851] "Failed to get status for pod" podUID="de82f429-1aa8-465f-b944-2bfb17d7d26b" pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5bc69c9d55-7trxv\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.370373 4685 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.370923 4685 status_manager.go:851] "Failed to get status for pod" podUID="ede8a25b-fb4b-4cbc-914d-1fc24155b8f1" pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-75994f449d-jj86b\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.371305 4685 status_manager.go:851] "Failed to get status for pod" podUID="7b92b45a-4c33-4463-96c1-0d4227d1d118" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.433481 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7b92b45a-4c33-4463-96c1-0d4227d1d118-kube-api-access\") pod \"7b92b45a-4c33-4463-96c1-0d4227d1d118\" (UID: \"7b92b45a-4c33-4463-96c1-0d4227d1d118\") " Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.433566 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7b92b45a-4c33-4463-96c1-0d4227d1d118-var-lock\") pod \"7b92b45a-4c33-4463-96c1-0d4227d1d118\" (UID: \"7b92b45a-4c33-4463-96c1-0d4227d1d118\") " Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.433587 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7b92b45a-4c33-4463-96c1-0d4227d1d118-kubelet-dir\") pod \"7b92b45a-4c33-4463-96c1-0d4227d1d118\" (UID: \"7b92b45a-4c33-4463-96c1-0d4227d1d118\") " Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.433636 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b92b45a-4c33-4463-96c1-0d4227d1d118-var-lock" (OuterVolumeSpecName: "var-lock") pod "7b92b45a-4c33-4463-96c1-0d4227d1d118" (UID: "7b92b45a-4c33-4463-96c1-0d4227d1d118"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.433696 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b92b45a-4c33-4463-96c1-0d4227d1d118-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7b92b45a-4c33-4463-96c1-0d4227d1d118" (UID: "7b92b45a-4c33-4463-96c1-0d4227d1d118"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.433910 4685 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7b92b45a-4c33-4463-96c1-0d4227d1d118-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.433927 4685 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7b92b45a-4c33-4463-96c1-0d4227d1d118-var-lock\") on node \"crc\" DevicePath \"\"" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.497289 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b92b45a-4c33-4463-96c1-0d4227d1d118-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7b92b45a-4c33-4463-96c1-0d4227d1d118" (UID: "7b92b45a-4c33-4463-96c1-0d4227d1d118"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.535159 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7b92b45a-4c33-4463-96c1-0d4227d1d118-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.759363 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.760569 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.761690 4685 status_manager.go:851] "Failed to get status for pod" podUID="7b92b45a-4c33-4463-96c1-0d4227d1d118" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.762294 4685 status_manager.go:851] "Failed to get status for pod" podUID="de82f429-1aa8-465f-b944-2bfb17d7d26b" pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5bc69c9d55-7trxv\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.763276 4685 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.763659 4685 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.764328 4685 status_manager.go:851] "Failed to get status for pod" podUID="ede8a25b-fb4b-4cbc-914d-1fc24155b8f1" pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-75994f449d-jj86b\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.839623 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.839737 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.839778 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.839874 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.839948 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.840022 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.840663 4685 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.840717 4685 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.840737 4685 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.943222 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"7b92b45a-4c33-4463-96c1-0d4227d1d118","Type":"ContainerDied","Data":"e11b6aaad494ccf25935123fbf6b04d5f96caf0dc48dcb7c61293c4b47e05eed"} Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.943800 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e11b6aaad494ccf25935123fbf6b04d5f96caf0dc48dcb7c61293c4b47e05eed" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.945520 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.948285 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.949662 4685 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ba587c4fe2f05966282b50ba5236b9f3d9ef6de63f72c70ae9f7a5222cb8b904" exitCode=0 Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.949895 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.949963 4685 scope.go:117] "RemoveContainer" containerID="cf0ba4af6f1f48fcb9ccf07dec53dc3ff1835a83dbf535bc48feb68fd646e78f" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.979555 4685 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.980200 4685 status_manager.go:851] "Failed to get status for pod" podUID="ede8a25b-fb4b-4cbc-914d-1fc24155b8f1" pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-75994f449d-jj86b\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.980697 4685 status_manager.go:851] "Failed to get status for pod" podUID="7b92b45a-4c33-4463-96c1-0d4227d1d118" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.981375 4685 status_manager.go:851] "Failed to get status for pod" podUID="de82f429-1aa8-465f-b944-2bfb17d7d26b" pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5bc69c9d55-7trxv\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.982346 4685 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.982473 4685 scope.go:117] "RemoveContainer" containerID="182ad351b6c632ea64087d4784ea919d5c21165dc5d0373fa35db7f7f1eea435" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.983056 4685 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: 
connect: connection refused" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.983516 4685 status_manager.go:851] "Failed to get status for pod" podUID="ede8a25b-fb4b-4cbc-914d-1fc24155b8f1" pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-75994f449d-jj86b\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.984137 4685 status_manager.go:851] "Failed to get status for pod" podUID="7b92b45a-4c33-4463-96c1-0d4227d1d118" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.984811 4685 status_manager.go:851] "Failed to get status for pod" podUID="de82f429-1aa8-465f-b944-2bfb17d7d26b" pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5bc69c9d55-7trxv\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:15 crc kubenswrapper[4685]: I0321 03:51:15.985443 4685 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:16 crc kubenswrapper[4685]: I0321 03:51:16.016458 4685 scope.go:117] "RemoveContainer" containerID="60c96edd458d05f217a2e9f07a44bd221303d821a790382a82cff0b912d48f63" Mar 21 03:51:16 crc kubenswrapper[4685]: I0321 03:51:16.045074 4685 scope.go:117] "RemoveContainer" containerID="a8583e9b5dff82d5df52e281ba4069e9259b1c8fe3d1b8121d0e9f3f9e97d47b" Mar 21 03:51:16 crc kubenswrapper[4685]: I0321 03:51:16.069514 4685 scope.go:117] "RemoveContainer" containerID="ba587c4fe2f05966282b50ba5236b9f3d9ef6de63f72c70ae9f7a5222cb8b904" Mar 21 03:51:16 crc kubenswrapper[4685]: I0321 03:51:16.091003 4685 scope.go:117] "RemoveContainer" containerID="52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d" Mar 21 03:51:16 crc kubenswrapper[4685]: I0321 03:51:16.116136 4685 scope.go:117] "RemoveContainer" containerID="cf0ba4af6f1f48fcb9ccf07dec53dc3ff1835a83dbf535bc48feb68fd646e78f" Mar 21 03:51:16 crc kubenswrapper[4685]: E0321 03:51:16.116961 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf0ba4af6f1f48fcb9ccf07dec53dc3ff1835a83dbf535bc48feb68fd646e78f\": container with ID starting with cf0ba4af6f1f48fcb9ccf07dec53dc3ff1835a83dbf535bc48feb68fd646e78f not found: ID does not exist" containerID="cf0ba4af6f1f48fcb9ccf07dec53dc3ff1835a83dbf535bc48feb68fd646e78f" Mar 21 03:51:16 crc kubenswrapper[4685]: I0321 03:51:16.116997 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf0ba4af6f1f48fcb9ccf07dec53dc3ff1835a83dbf535bc48feb68fd646e78f"} err="failed to get container status \"cf0ba4af6f1f48fcb9ccf07dec53dc3ff1835a83dbf535bc48feb68fd646e78f\": rpc error: code = NotFound desc = could not find container \"cf0ba4af6f1f48fcb9ccf07dec53dc3ff1835a83dbf535bc48feb68fd646e78f\": container 
with ID starting with cf0ba4af6f1f48fcb9ccf07dec53dc3ff1835a83dbf535bc48feb68fd646e78f not found: ID does not exist" Mar 21 03:51:16 crc kubenswrapper[4685]: I0321 03:51:16.117021 4685 scope.go:117] "RemoveContainer" containerID="182ad351b6c632ea64087d4784ea919d5c21165dc5d0373fa35db7f7f1eea435" Mar 21 03:51:16 crc kubenswrapper[4685]: E0321 03:51:16.117391 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"182ad351b6c632ea64087d4784ea919d5c21165dc5d0373fa35db7f7f1eea435\": container with ID starting with 182ad351b6c632ea64087d4784ea919d5c21165dc5d0373fa35db7f7f1eea435 not found: ID does not exist" containerID="182ad351b6c632ea64087d4784ea919d5c21165dc5d0373fa35db7f7f1eea435" Mar 21 03:51:16 crc kubenswrapper[4685]: I0321 03:51:16.117431 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"182ad351b6c632ea64087d4784ea919d5c21165dc5d0373fa35db7f7f1eea435"} err="failed to get container status \"182ad351b6c632ea64087d4784ea919d5c21165dc5d0373fa35db7f7f1eea435\": rpc error: code = NotFound desc = could not find container \"182ad351b6c632ea64087d4784ea919d5c21165dc5d0373fa35db7f7f1eea435\": container with ID starting with 182ad351b6c632ea64087d4784ea919d5c21165dc5d0373fa35db7f7f1eea435 not found: ID does not exist" Mar 21 03:51:16 crc kubenswrapper[4685]: I0321 03:51:16.117460 4685 scope.go:117] "RemoveContainer" containerID="60c96edd458d05f217a2e9f07a44bd221303d821a790382a82cff0b912d48f63" Mar 21 03:51:16 crc kubenswrapper[4685]: E0321 03:51:16.117723 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60c96edd458d05f217a2e9f07a44bd221303d821a790382a82cff0b912d48f63\": container with ID starting with 60c96edd458d05f217a2e9f07a44bd221303d821a790382a82cff0b912d48f63 not found: ID does not exist" containerID="60c96edd458d05f217a2e9f07a44bd221303d821a790382a82cff0b912d48f63" Mar 21 03:51:16 crc kubenswrapper[4685]: I0321 03:51:16.117739 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60c96edd458d05f217a2e9f07a44bd221303d821a790382a82cff0b912d48f63"} err="failed to get container status \"60c96edd458d05f217a2e9f07a44bd221303d821a790382a82cff0b912d48f63\": rpc error: code = NotFound desc = could not find container \"60c96edd458d05f217a2e9f07a44bd221303d821a790382a82cff0b912d48f63\": container with ID starting with 60c96edd458d05f217a2e9f07a44bd221303d821a790382a82cff0b912d48f63 not found: ID does not exist" Mar 21 03:51:16 crc kubenswrapper[4685]: I0321 03:51:16.117751 4685 scope.go:117] "RemoveContainer" containerID="a8583e9b5dff82d5df52e281ba4069e9259b1c8fe3d1b8121d0e9f3f9e97d47b" Mar 21 03:51:16 crc kubenswrapper[4685]: E0321 03:51:16.118051 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8583e9b5dff82d5df52e281ba4069e9259b1c8fe3d1b8121d0e9f3f9e97d47b\": container with ID starting with a8583e9b5dff82d5df52e281ba4069e9259b1c8fe3d1b8121d0e9f3f9e97d47b not found: ID does not exist" containerID="a8583e9b5dff82d5df52e281ba4069e9259b1c8fe3d1b8121d0e9f3f9e97d47b" Mar 21 03:51:16 crc kubenswrapper[4685]: I0321 03:51:16.118094 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8583e9b5dff82d5df52e281ba4069e9259b1c8fe3d1b8121d0e9f3f9e97d47b"} err="failed to get container status 
\"a8583e9b5dff82d5df52e281ba4069e9259b1c8fe3d1b8121d0e9f3f9e97d47b\": rpc error: code = NotFound desc = could not find container \"a8583e9b5dff82d5df52e281ba4069e9259b1c8fe3d1b8121d0e9f3f9e97d47b\": container with ID starting with a8583e9b5dff82d5df52e281ba4069e9259b1c8fe3d1b8121d0e9f3f9e97d47b not found: ID does not exist" Mar 21 03:51:16 crc kubenswrapper[4685]: I0321 03:51:16.118131 4685 scope.go:117] "RemoveContainer" containerID="ba587c4fe2f05966282b50ba5236b9f3d9ef6de63f72c70ae9f7a5222cb8b904" Mar 21 03:51:16 crc kubenswrapper[4685]: E0321 03:51:16.118387 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba587c4fe2f05966282b50ba5236b9f3d9ef6de63f72c70ae9f7a5222cb8b904\": container with ID starting with ba587c4fe2f05966282b50ba5236b9f3d9ef6de63f72c70ae9f7a5222cb8b904 not found: ID does not exist" containerID="ba587c4fe2f05966282b50ba5236b9f3d9ef6de63f72c70ae9f7a5222cb8b904" Mar 21 03:51:16 crc kubenswrapper[4685]: I0321 03:51:16.118403 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba587c4fe2f05966282b50ba5236b9f3d9ef6de63f72c70ae9f7a5222cb8b904"} err="failed to get container status \"ba587c4fe2f05966282b50ba5236b9f3d9ef6de63f72c70ae9f7a5222cb8b904\": rpc error: code = NotFound desc = could not find container \"ba587c4fe2f05966282b50ba5236b9f3d9ef6de63f72c70ae9f7a5222cb8b904\": container with ID starting with ba587c4fe2f05966282b50ba5236b9f3d9ef6de63f72c70ae9f7a5222cb8b904 not found: ID does not exist" Mar 21 03:51:16 crc kubenswrapper[4685]: I0321 03:51:16.118416 4685 scope.go:117] "RemoveContainer" containerID="52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d" Mar 21 03:51:16 crc kubenswrapper[4685]: E0321 03:51:16.118636 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\": container with ID starting with 52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d not found: ID does not exist" containerID="52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d" Mar 21 03:51:16 crc kubenswrapper[4685]: I0321 03:51:16.118671 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d"} err="failed to get container status \"52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\": rpc error: code = NotFound desc = could not find container \"52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d\": container with ID starting with 52df5756c4d3e8d1087a5a38b3e2b9966fb492d666c43773aa12a4cd83f6158d not found: ID does not exist" Mar 21 03:51:16 crc kubenswrapper[4685]: I0321 03:51:16.315187 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 21 03:51:18 crc kubenswrapper[4685]: E0321 03:51:18.090990 4685 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.158:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189ebec90074fa75 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 03:51:13.735924341 +0000 UTC m=+306.212993173,LastTimestamp:2026-03-21 03:51:13.735924341 +0000 UTC m=+306.212993173,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 03:51:18 crc kubenswrapper[4685]: I0321 03:51:18.309529 4685 status_manager.go:851] "Failed to get status for pod" podUID="de82f429-1aa8-465f-b944-2bfb17d7d26b" pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5bc69c9d55-7trxv\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:18 crc kubenswrapper[4685]: I0321 03:51:18.310629 4685 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:18 crc kubenswrapper[4685]: I0321 03:51:18.311378 4685 status_manager.go:851] "Failed to get status for pod" podUID="ede8a25b-fb4b-4cbc-914d-1fc24155b8f1" pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-75994f449d-jj86b\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:18 crc kubenswrapper[4685]: I0321 03:51:18.312091 4685 status_manager.go:851] "Failed to get status for pod" podUID="7b92b45a-4c33-4463-96c1-0d4227d1d118" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:18 crc kubenswrapper[4685]: E0321 03:51:18.355122 4685 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.158:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" volumeName="registry-storage" Mar 21 03:51:22 crc kubenswrapper[4685]: E0321 03:51:22.502687 4685 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:22 crc kubenswrapper[4685]: E0321 03:51:22.504078 4685 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.102.83.158:6443: connect: connection refused" Mar 21 03:51:22 crc kubenswrapper[4685]: E0321 03:51:22.504631 4685 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:22 crc kubenswrapper[4685]: E0321 03:51:22.505234 4685 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:22 crc kubenswrapper[4685]: E0321 03:51:22.505607 4685 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:22 crc kubenswrapper[4685]: I0321 03:51:22.505660 4685 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 21 03:51:22 crc kubenswrapper[4685]: E0321 03:51:22.506106 4685 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="200ms" Mar 21 03:51:22 crc kubenswrapper[4685]: E0321 03:51:22.707036 4685 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="400ms" Mar 21 03:51:23 crc kubenswrapper[4685]: E0321 03:51:23.108207 4685 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="800ms" Mar 21 03:51:23 crc kubenswrapper[4685]: E0321 03:51:23.909306 4685 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="1.6s" Mar 21 03:51:25 crc kubenswrapper[4685]: E0321 03:51:25.510292 4685 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="3.2s" Mar 21 03:51:26 crc kubenswrapper[4685]: I0321 03:51:26.300138 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:51:26 crc kubenswrapper[4685]: I0321 03:51:26.301437 4685 status_manager.go:851] "Failed to get status for pod" podUID="7b92b45a-4c33-4463-96c1-0d4227d1d118" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:26 crc kubenswrapper[4685]: I0321 03:51:26.302125 4685 status_manager.go:851] "Failed to get status for pod" podUID="de82f429-1aa8-465f-b944-2bfb17d7d26b" pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5bc69c9d55-7trxv\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:26 crc kubenswrapper[4685]: I0321 03:51:26.302450 4685 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:26 crc kubenswrapper[4685]: I0321 03:51:26.302670 4685 status_manager.go:851] "Failed to get status for pod" podUID="ede8a25b-fb4b-4cbc-914d-1fc24155b8f1" pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-75994f449d-jj86b\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:26 crc kubenswrapper[4685]: I0321 03:51:26.329061 4685 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7fd9d618-b4ed-4942-b915-76dc59fb834a" Mar 21 03:51:26 crc kubenswrapper[4685]: I0321 03:51:26.329353 4685 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7fd9d618-b4ed-4942-b915-76dc59fb834a" Mar 21 03:51:26 crc kubenswrapper[4685]: E0321 03:51:26.330211 4685 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:51:26 crc kubenswrapper[4685]: I0321 03:51:26.330868 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:51:27 crc kubenswrapper[4685]: I0321 03:51:27.049198 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 21 03:51:27 crc kubenswrapper[4685]: I0321 03:51:27.051071 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 21 03:51:27 crc kubenswrapper[4685]: I0321 03:51:27.051129 4685 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="aa74801f196e57343a98011d671414abf1f1ebf4d7b962be522f0b6cad777acf" exitCode=1 Mar 21 03:51:27 crc kubenswrapper[4685]: I0321 03:51:27.051186 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"aa74801f196e57343a98011d671414abf1f1ebf4d7b962be522f0b6cad777acf"} Mar 21 03:51:27 crc kubenswrapper[4685]: I0321 03:51:27.052191 4685 scope.go:117] "RemoveContainer" containerID="aa74801f196e57343a98011d671414abf1f1ebf4d7b962be522f0b6cad777acf" Mar 21 03:51:27 crc kubenswrapper[4685]: I0321 03:51:27.052489 4685 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:27 crc kubenswrapper[4685]: I0321 03:51:27.052868 4685 status_manager.go:851] "Failed to get status for pod" podUID="de82f429-1aa8-465f-b944-2bfb17d7d26b" pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5bc69c9d55-7trxv\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:27 crc kubenswrapper[4685]: I0321 03:51:27.053234 4685 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="10f0fc42791ef2b5ef1ca74eabc0254c47489b5ea4353f816c50cc73173902ed" exitCode=0 Mar 21 03:51:27 crc kubenswrapper[4685]: I0321 03:51:27.053287 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"10f0fc42791ef2b5ef1ca74eabc0254c47489b5ea4353f816c50cc73173902ed"} Mar 21 03:51:27 crc kubenswrapper[4685]: I0321 03:51:27.053395 4685 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:27 crc kubenswrapper[4685]: I0321 03:51:27.053401 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"68e3e136512b9cc2c427aca1efe125bc20b93706541acf7ce535eefb56935ba4"} Mar 21 03:51:27 crc 
kubenswrapper[4685]: I0321 03:51:27.053628 4685 status_manager.go:851] "Failed to get status for pod" podUID="ede8a25b-fb4b-4cbc-914d-1fc24155b8f1" pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-75994f449d-jj86b\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:27 crc kubenswrapper[4685]: I0321 03:51:27.053956 4685 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7fd9d618-b4ed-4942-b915-76dc59fb834a" Mar 21 03:51:27 crc kubenswrapper[4685]: I0321 03:51:27.054002 4685 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7fd9d618-b4ed-4942-b915-76dc59fb834a" Mar 21 03:51:27 crc kubenswrapper[4685]: I0321 03:51:27.054619 4685 status_manager.go:851] "Failed to get status for pod" podUID="7b92b45a-4c33-4463-96c1-0d4227d1d118" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:27 crc kubenswrapper[4685]: E0321 03:51:27.054697 4685 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:51:27 crc kubenswrapper[4685]: I0321 03:51:27.055110 4685 status_manager.go:851] "Failed to get status for pod" podUID="de82f429-1aa8-465f-b944-2bfb17d7d26b" pod="openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5bc69c9d55-7trxv\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:27 crc kubenswrapper[4685]: I0321 03:51:27.055813 4685 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:27 crc kubenswrapper[4685]: I0321 03:51:27.056213 4685 status_manager.go:851] "Failed to get status for pod" podUID="ede8a25b-fb4b-4cbc-914d-1fc24155b8f1" pod="openshift-controller-manager/controller-manager-75994f449d-jj86b" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-75994f449d-jj86b\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:27 crc kubenswrapper[4685]: I0321 03:51:27.056680 4685 status_manager.go:851] "Failed to get status for pod" podUID="7b92b45a-4c33-4463-96c1-0d4227d1d118" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:27 crc kubenswrapper[4685]: I0321 03:51:27.056993 4685 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 21 03:51:28 crc kubenswrapper[4685]: I0321 03:51:28.073664 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 21 03:51:28 crc kubenswrapper[4685]: I0321 03:51:28.074329 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 21 03:51:28 crc kubenswrapper[4685]: I0321 03:51:28.074384 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"97b1c4b4aaa767b415ec30c75364ddc636da9df24b0e2c6d093aeae565bf5f56"} Mar 21 03:51:28 crc kubenswrapper[4685]: I0321 03:51:28.078006 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"63f901333d3b41ee6dabc3c960d94078759b181585fcf426baaee15c2c2171c6"} Mar 21 03:51:28 crc kubenswrapper[4685]: I0321 03:51:28.078031 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e25cdd73051310c72ae9c4811fffa1bea4bad22cd4e32dc65ca2eaf385e52ec2"} Mar 21 03:51:28 crc kubenswrapper[4685]: I0321 03:51:28.078041 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"767d3b65fed7adca198067d12de47411774e62bc70bfcd225aee50e23cd9eef0"} Mar 21 03:51:28 crc kubenswrapper[4685]: I0321 03:51:28.078050 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f87429d4c9260fda7708d72b8b789a9dc466e934b3eea530486a12cdc9fa1525"} Mar 21 03:51:29 crc kubenswrapper[4685]: I0321 03:51:29.089805 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f84dc4e0e0bb49de918f17b5b4d3542b5a69ec79e76b76cb97e53cb3ca2051cd"} Mar 21 03:51:29 crc kubenswrapper[4685]: I0321 03:51:29.090050 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:51:29 crc kubenswrapper[4685]: I0321 03:51:29.090257 4685 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7fd9d618-b4ed-4942-b915-76dc59fb834a" Mar 21 03:51:29 crc kubenswrapper[4685]: I0321 03:51:29.090293 4685 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7fd9d618-b4ed-4942-b915-76dc59fb834a" Mar 21 03:51:30 crc kubenswrapper[4685]: I0321 03:51:30.549189 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h" podUID="ce8529fe-4546-466d-bb07-3ee73cf1bc1f" containerName="oauth-openshift" 
containerID="cri-o://6ac1c261b76689f8fe7cc196075da5de0b5071a522baff19ac6f7038191a2302" gracePeriod=15 Mar 21 03:51:30 crc kubenswrapper[4685]: I0321 03:51:30.952869 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h" Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.099067 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-router-certs\") pod \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.099161 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-session\") pod \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.099198 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-user-template-error\") pod \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.099235 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hf5rg\" (UniqueName: \"kubernetes.io/projected/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-kube-api-access-hf5rg\") pod \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.099298 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-user-template-provider-selection\") pod \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.099360 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-user-idp-0-file-data\") pod \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.099407 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-user-template-login\") pod \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.099447 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-trusted-ca-bundle\") pod \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.099484 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-service-ca\") pod \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.099543 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-ocp-branding-template\") pod \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.099598 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-serving-cert\") pod \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.099644 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-cliconfig\") pod \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.099677 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-audit-dir\") pod \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.099714 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-audit-policies\") pod \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\" (UID: \"ce8529fe-4546-466d-bb07-3ee73cf1bc1f\") " Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.100797 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "ce8529fe-4546-466d-bb07-3ee73cf1bc1f" (UID: "ce8529fe-4546-466d-bb07-3ee73cf1bc1f"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.101592 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "ce8529fe-4546-466d-bb07-3ee73cf1bc1f" (UID: "ce8529fe-4546-466d-bb07-3ee73cf1bc1f"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.101665 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "ce8529fe-4546-466d-bb07-3ee73cf1bc1f" (UID: "ce8529fe-4546-466d-bb07-3ee73cf1bc1f"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.102271 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "ce8529fe-4546-466d-bb07-3ee73cf1bc1f" (UID: "ce8529fe-4546-466d-bb07-3ee73cf1bc1f"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.102335 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "ce8529fe-4546-466d-bb07-3ee73cf1bc1f" (UID: "ce8529fe-4546-466d-bb07-3ee73cf1bc1f"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.107961 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "ce8529fe-4546-466d-bb07-3ee73cf1bc1f" (UID: "ce8529fe-4546-466d-bb07-3ee73cf1bc1f"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.108993 4685 generic.go:334] "Generic (PLEG): container finished" podID="ce8529fe-4546-466d-bb07-3ee73cf1bc1f" containerID="6ac1c261b76689f8fe7cc196075da5de0b5071a522baff19ac6f7038191a2302" exitCode=0 Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.109069 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h" event={"ID":"ce8529fe-4546-466d-bb07-3ee73cf1bc1f","Type":"ContainerDied","Data":"6ac1c261b76689f8fe7cc196075da5de0b5071a522baff19ac6f7038191a2302"} Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.109110 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h" event={"ID":"ce8529fe-4546-466d-bb07-3ee73cf1bc1f","Type":"ContainerDied","Data":"e24c4fa3a42c700e29624cf05630b1d7f1435c08e4277b11c630c72f423b271a"} Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.109138 4685 scope.go:117] "RemoveContainer" containerID="6ac1c261b76689f8fe7cc196075da5de0b5071a522baff19ac6f7038191a2302" Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.109390 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-m7q8h" Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.110535 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "ce8529fe-4546-466d-bb07-3ee73cf1bc1f" (UID: "ce8529fe-4546-466d-bb07-3ee73cf1bc1f"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.112080 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "ce8529fe-4546-466d-bb07-3ee73cf1bc1f" (UID: "ce8529fe-4546-466d-bb07-3ee73cf1bc1f"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.112763 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "ce8529fe-4546-466d-bb07-3ee73cf1bc1f" (UID: "ce8529fe-4546-466d-bb07-3ee73cf1bc1f"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.112950 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "ce8529fe-4546-466d-bb07-3ee73cf1bc1f" (UID: "ce8529fe-4546-466d-bb07-3ee73cf1bc1f"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.115921 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-kube-api-access-hf5rg" (OuterVolumeSpecName: "kube-api-access-hf5rg") pod "ce8529fe-4546-466d-bb07-3ee73cf1bc1f" (UID: "ce8529fe-4546-466d-bb07-3ee73cf1bc1f"). InnerVolumeSpecName "kube-api-access-hf5rg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.118672 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "ce8529fe-4546-466d-bb07-3ee73cf1bc1f" (UID: "ce8529fe-4546-466d-bb07-3ee73cf1bc1f"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.119256 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "ce8529fe-4546-466d-bb07-3ee73cf1bc1f" (UID: "ce8529fe-4546-466d-bb07-3ee73cf1bc1f"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.122307 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "ce8529fe-4546-466d-bb07-3ee73cf1bc1f" (UID: "ce8529fe-4546-466d-bb07-3ee73cf1bc1f"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.193396 4685 scope.go:117] "RemoveContainer" containerID="6ac1c261b76689f8fe7cc196075da5de0b5071a522baff19ac6f7038191a2302" Mar 21 03:51:31 crc kubenswrapper[4685]: E0321 03:51:31.194239 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ac1c261b76689f8fe7cc196075da5de0b5071a522baff19ac6f7038191a2302\": container with ID starting with 6ac1c261b76689f8fe7cc196075da5de0b5071a522baff19ac6f7038191a2302 not found: ID does not exist" containerID="6ac1c261b76689f8fe7cc196075da5de0b5071a522baff19ac6f7038191a2302" Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.194358 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ac1c261b76689f8fe7cc196075da5de0b5071a522baff19ac6f7038191a2302"} err="failed to get container status \"6ac1c261b76689f8fe7cc196075da5de0b5071a522baff19ac6f7038191a2302\": rpc error: code = NotFound desc = could not find container \"6ac1c261b76689f8fe7cc196075da5de0b5071a522baff19ac6f7038191a2302\": container with ID starting with 6ac1c261b76689f8fe7cc196075da5de0b5071a522baff19ac6f7038191a2302 not found: ID does not exist" Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.201041 4685 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.201083 4685 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.201095 4685 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.201106 4685 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.201117 4685 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.201124 4685 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.201133 4685 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.201141 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hf5rg\" (UniqueName: \"kubernetes.io/projected/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-kube-api-access-hf5rg\") on node 
\"crc\" DevicePath \"\"" Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.201151 4685 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.201160 4685 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.201168 4685 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.201177 4685 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.201185 4685 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.201200 4685 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ce8529fe-4546-466d-bb07-3ee73cf1bc1f-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.331702 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.331799 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:51:31 crc kubenswrapper[4685]: I0321 03:51:31.342322 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:51:33 crc kubenswrapper[4685]: I0321 03:51:33.799229 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 03:51:33 crc kubenswrapper[4685]: I0321 03:51:33.799463 4685 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 21 03:51:33 crc kubenswrapper[4685]: I0321 03:51:33.800539 4685 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 21 03:51:34 crc kubenswrapper[4685]: I0321 03:51:34.104280 4685 kubelet.go:1914] "Deleted mirror pod because it is outdated" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:51:34 crc kubenswrapper[4685]: I0321 03:51:34.867651 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 03:51:35 crc kubenswrapper[4685]: I0321 03:51:35.136202 4685 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7fd9d618-b4ed-4942-b915-76dc59fb834a" Mar 21 03:51:35 crc kubenswrapper[4685]: I0321 03:51:35.136245 4685 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7fd9d618-b4ed-4942-b915-76dc59fb834a" Mar 21 03:51:35 crc kubenswrapper[4685]: I0321 03:51:35.140367 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 03:51:35 crc kubenswrapper[4685]: I0321 03:51:35.143339 4685 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="4d9431f8-b10e-4cfc-9e18-c4a04c56b94f" Mar 21 03:51:36 crc kubenswrapper[4685]: I0321 03:51:36.141205 4685 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7fd9d618-b4ed-4942-b915-76dc59fb834a" Mar 21 03:51:36 crc kubenswrapper[4685]: I0321 03:51:36.141602 4685 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7fd9d618-b4ed-4942-b915-76dc59fb834a" Mar 21 03:51:38 crc kubenswrapper[4685]: I0321 03:51:38.319384 4685 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="4d9431f8-b10e-4cfc-9e18-c4a04c56b94f" Mar 21 03:51:43 crc kubenswrapper[4685]: I0321 03:51:43.806563 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 03:51:43 crc kubenswrapper[4685]: I0321 03:51:43.815494 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 03:51:45 crc kubenswrapper[4685]: I0321 03:51:45.109217 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 21 03:51:45 crc kubenswrapper[4685]: I0321 03:51:45.295203 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 21 03:51:45 crc kubenswrapper[4685]: I0321 03:51:45.916982 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 21 03:51:46 crc kubenswrapper[4685]: I0321 03:51:46.190414 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 21 03:51:46 crc kubenswrapper[4685]: I0321 03:51:46.494344 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 21 03:51:46 crc kubenswrapper[4685]: I0321 03:51:46.692827 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 21 03:51:46 crc kubenswrapper[4685]: I0321 03:51:46.787560 4685 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 21 03:51:46 crc kubenswrapper[4685]: I0321 03:51:46.806053 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 21 03:51:47 crc kubenswrapper[4685]: I0321 03:51:47.022023 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 21 03:51:47 crc kubenswrapper[4685]: I0321 03:51:47.030092 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 21 03:51:47 crc kubenswrapper[4685]: I0321 03:51:47.168705 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 21 03:51:47 crc kubenswrapper[4685]: I0321 03:51:47.265563 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 21 03:51:47 crc kubenswrapper[4685]: I0321 03:51:47.306913 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 21 03:51:47 crc kubenswrapper[4685]: I0321 03:51:47.406532 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 21 03:51:47 crc kubenswrapper[4685]: I0321 03:51:47.596880 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 21 03:51:47 crc kubenswrapper[4685]: I0321 03:51:47.641476 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 21 03:51:47 crc kubenswrapper[4685]: I0321 03:51:47.929726 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 21 03:51:48 crc kubenswrapper[4685]: I0321 03:51:48.007728 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 21 03:51:48 crc kubenswrapper[4685]: I0321 03:51:48.236955 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 21 03:51:48 crc kubenswrapper[4685]: I0321 03:51:48.240011 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 21 03:51:48 crc kubenswrapper[4685]: I0321 03:51:48.246509 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 21 03:51:48 crc kubenswrapper[4685]: I0321 03:51:48.279439 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 21 03:51:48 crc kubenswrapper[4685]: I0321 03:51:48.306822 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 21 03:51:48 crc kubenswrapper[4685]: I0321 03:51:48.306972 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 21 03:51:48 crc kubenswrapper[4685]: I0321 03:51:48.589763 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 21 03:51:49 crc kubenswrapper[4685]: I0321 03:51:49.091816 4685 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 21 03:51:49 crc kubenswrapper[4685]: I0321 03:51:49.120658 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 21 03:51:49 crc kubenswrapper[4685]: I0321 03:51:49.156752 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 21 03:51:49 crc kubenswrapper[4685]: I0321 03:51:49.298708 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 21 03:51:49 crc kubenswrapper[4685]: I0321 03:51:49.344025 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 21 03:51:49 crc kubenswrapper[4685]: I0321 03:51:49.429771 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 21 03:51:49 crc kubenswrapper[4685]: I0321 03:51:49.471121 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 21 03:51:49 crc kubenswrapper[4685]: I0321 03:51:49.535681 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 21 03:51:49 crc kubenswrapper[4685]: I0321 03:51:49.697066 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 21 03:51:49 crc kubenswrapper[4685]: I0321 03:51:49.802800 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 21 03:51:49 crc kubenswrapper[4685]: I0321 03:51:49.996373 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 21 03:51:50 crc kubenswrapper[4685]: I0321 03:51:50.049076 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 21 03:51:50 crc kubenswrapper[4685]: I0321 03:51:50.062395 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 21 03:51:50 crc kubenswrapper[4685]: I0321 03:51:50.075748 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 21 03:51:50 crc kubenswrapper[4685]: I0321 03:51:50.213851 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 21 03:51:50 crc kubenswrapper[4685]: I0321 03:51:50.234875 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 21 03:51:50 crc kubenswrapper[4685]: I0321 03:51:50.246096 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 21 03:51:50 crc kubenswrapper[4685]: I0321 03:51:50.279578 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 21 03:51:50 crc kubenswrapper[4685]: I0321 03:51:50.339006 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 21 03:51:50 crc kubenswrapper[4685]: I0321 03:51:50.342341 4685 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 21 03:51:50 crc kubenswrapper[4685]: I0321 03:51:50.355257 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 21 03:51:50 crc kubenswrapper[4685]: I0321 03:51:50.390805 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 21 03:51:50 crc kubenswrapper[4685]: I0321 03:51:50.406829 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 21 03:51:50 crc kubenswrapper[4685]: I0321 03:51:50.438555 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 21 03:51:50 crc kubenswrapper[4685]: I0321 03:51:50.452901 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 21 03:51:50 crc kubenswrapper[4685]: I0321 03:51:50.457683 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 21 03:51:50 crc kubenswrapper[4685]: I0321 03:51:50.476620 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 21 03:51:50 crc kubenswrapper[4685]: I0321 03:51:50.498194 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 21 03:51:50 crc kubenswrapper[4685]: I0321 03:51:50.498820 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 21 03:51:50 crc kubenswrapper[4685]: I0321 03:51:50.526184 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 21 03:51:50 crc kubenswrapper[4685]: I0321 03:51:50.636643 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 21 03:51:50 crc kubenswrapper[4685]: I0321 03:51:50.700293 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 21 03:51:50 crc kubenswrapper[4685]: I0321 03:51:50.767055 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 21 03:51:50 crc kubenswrapper[4685]: I0321 03:51:50.804725 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 21 03:51:50 crc kubenswrapper[4685]: I0321 03:51:50.821258 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 21 03:51:50 crc kubenswrapper[4685]: I0321 03:51:50.824951 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 21 03:51:50 crc kubenswrapper[4685]: I0321 03:51:50.869376 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 21 03:51:50 crc kubenswrapper[4685]: I0321 03:51:50.871860 4685 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"serving-cert" Mar 21 03:51:50 crc kubenswrapper[4685]: I0321 03:51:50.950397 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 21 03:51:50 crc kubenswrapper[4685]: I0321 03:51:50.963251 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 21 03:51:50 crc kubenswrapper[4685]: I0321 03:51:50.997893 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 21 03:51:51 crc kubenswrapper[4685]: I0321 03:51:51.000292 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 21 03:51:51 crc kubenswrapper[4685]: I0321 03:51:51.001399 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 21 03:51:51 crc kubenswrapper[4685]: I0321 03:51:51.017386 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 21 03:51:51 crc kubenswrapper[4685]: I0321 03:51:51.024831 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 21 03:51:51 crc kubenswrapper[4685]: I0321 03:51:51.039420 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 21 03:51:51 crc kubenswrapper[4685]: I0321 03:51:51.040563 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 21 03:51:51 crc kubenswrapper[4685]: I0321 03:51:51.142205 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 21 03:51:51 crc kubenswrapper[4685]: I0321 03:51:51.148061 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 21 03:51:51 crc kubenswrapper[4685]: I0321 03:51:51.157029 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 21 03:51:51 crc kubenswrapper[4685]: I0321 03:51:51.214974 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 21 03:51:51 crc kubenswrapper[4685]: I0321 03:51:51.248028 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 21 03:51:51 crc kubenswrapper[4685]: I0321 03:51:51.313901 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 21 03:51:51 crc kubenswrapper[4685]: I0321 03:51:51.315360 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 21 03:51:51 crc kubenswrapper[4685]: I0321 03:51:51.415774 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 21 03:51:51 crc kubenswrapper[4685]: I0321 03:51:51.430747 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 21 03:51:51 crc kubenswrapper[4685]: I0321 03:51:51.490386 4685 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-network-console"/"networking-console-plugin-cert" Mar 21 03:51:51 crc kubenswrapper[4685]: I0321 03:51:51.518697 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 21 03:51:51 crc kubenswrapper[4685]: I0321 03:51:51.533631 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 21 03:51:51 crc kubenswrapper[4685]: I0321 03:51:51.632402 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 21 03:51:51 crc kubenswrapper[4685]: I0321 03:51:51.639613 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 21 03:51:51 crc kubenswrapper[4685]: I0321 03:51:51.764564 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 21 03:51:51 crc kubenswrapper[4685]: I0321 03:51:51.766360 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 21 03:51:51 crc kubenswrapper[4685]: I0321 03:51:51.801938 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 21 03:51:51 crc kubenswrapper[4685]: I0321 03:51:51.834088 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 21 03:51:51 crc kubenswrapper[4685]: I0321 03:51:51.841407 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 21 03:51:51 crc kubenswrapper[4685]: I0321 03:51:51.861866 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 21 03:51:51 crc kubenswrapper[4685]: I0321 03:51:51.881177 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 21 03:51:51 crc kubenswrapper[4685]: I0321 03:51:51.916700 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 21 03:51:51 crc kubenswrapper[4685]: I0321 03:51:51.932025 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 21 03:51:51 crc kubenswrapper[4685]: I0321 03:51:51.946178 4685 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 21 03:51:52 crc kubenswrapper[4685]: I0321 03:51:52.030797 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 21 03:51:52 crc kubenswrapper[4685]: I0321 03:51:52.066125 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 21 03:51:52 crc kubenswrapper[4685]: I0321 03:51:52.113257 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 21 03:51:52 crc kubenswrapper[4685]: I0321 03:51:52.235065 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 21 03:51:52 crc kubenswrapper[4685]: I0321 03:51:52.264233 4685 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 21 03:51:52 crc kubenswrapper[4685]: I0321 03:51:52.308678 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 21 03:51:52 crc kubenswrapper[4685]: I0321 03:51:52.413339 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 21 03:51:52 crc kubenswrapper[4685]: I0321 03:51:52.603384 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 21 03:51:52 crc kubenswrapper[4685]: I0321 03:51:52.689379 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 21 03:51:52 crc kubenswrapper[4685]: I0321 03:51:52.746530 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 21 03:51:52 crc kubenswrapper[4685]: I0321 03:51:52.765083 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 21 03:51:52 crc kubenswrapper[4685]: I0321 03:51:52.786038 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 21 03:51:52 crc kubenswrapper[4685]: I0321 03:51:52.835364 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 21 03:51:52 crc kubenswrapper[4685]: I0321 03:51:52.863721 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 21 03:51:52 crc kubenswrapper[4685]: I0321 03:51:52.945931 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 21 03:51:52 crc kubenswrapper[4685]: I0321 03:51:52.961453 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 21 03:51:52 crc kubenswrapper[4685]: I0321 03:51:52.968661 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 21 03:51:52 crc kubenswrapper[4685]: I0321 03:51:52.973569 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 21 03:51:53 crc kubenswrapper[4685]: I0321 03:51:53.144319 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 21 03:51:53 crc kubenswrapper[4685]: I0321 03:51:53.156258 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 21 03:51:53 crc kubenswrapper[4685]: I0321 03:51:53.218658 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 21 03:51:53 crc kubenswrapper[4685]: I0321 03:51:53.246184 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 21 03:51:53 crc kubenswrapper[4685]: I0321 03:51:53.260212 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 21 03:51:53 crc kubenswrapper[4685]: I0321 03:51:53.276151 4685 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 21 03:51:53 crc kubenswrapper[4685]: I0321 03:51:53.528575 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 21 03:51:53 crc kubenswrapper[4685]: I0321 03:51:53.533949 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 21 03:51:53 crc kubenswrapper[4685]: I0321 03:51:53.601908 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 21 03:51:53 crc kubenswrapper[4685]: I0321 03:51:53.657050 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 21 03:51:53 crc kubenswrapper[4685]: I0321 03:51:53.684093 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 21 03:51:53 crc kubenswrapper[4685]: I0321 03:51:53.685350 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 21 03:51:53 crc kubenswrapper[4685]: I0321 03:51:53.705051 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 21 03:51:53 crc kubenswrapper[4685]: I0321 03:51:53.815191 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 21 03:51:53 crc kubenswrapper[4685]: I0321 03:51:53.853648 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 21 03:51:53 crc kubenswrapper[4685]: I0321 03:51:53.961927 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 21 03:51:54 crc kubenswrapper[4685]: I0321 03:51:54.023930 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 21 03:51:54 crc kubenswrapper[4685]: I0321 03:51:54.108972 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 21 03:51:54 crc kubenswrapper[4685]: I0321 03:51:54.113390 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 21 03:51:54 crc kubenswrapper[4685]: I0321 03:51:54.265520 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 21 03:51:54 crc kubenswrapper[4685]: I0321 03:51:54.271629 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 21 03:51:54 crc kubenswrapper[4685]: I0321 03:51:54.297773 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 21 03:51:54 crc kubenswrapper[4685]: I0321 03:51:54.468575 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 21 03:51:54 crc kubenswrapper[4685]: I0321 03:51:54.513580 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 21 
03:51:54 crc kubenswrapper[4685]: I0321 03:51:54.547808 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 21 03:51:54 crc kubenswrapper[4685]: I0321 03:51:54.686879 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 21 03:51:54 crc kubenswrapper[4685]: I0321 03:51:54.699591 4685 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 21 03:51:54 crc kubenswrapper[4685]: I0321 03:51:54.702031 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 21 03:51:54 crc kubenswrapper[4685]: I0321 03:51:54.707878 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 21 03:51:54 crc kubenswrapper[4685]: I0321 03:51:54.750655 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 21 03:51:54 crc kubenswrapper[4685]: I0321 03:51:54.806747 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 21 03:51:54 crc kubenswrapper[4685]: I0321 03:51:54.810631 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 21 03:51:54 crc kubenswrapper[4685]: I0321 03:51:54.958717 4685 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 21 03:51:55 crc kubenswrapper[4685]: I0321 03:51:55.014407 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 21 03:51:55 crc kubenswrapper[4685]: I0321 03:51:55.057345 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 21 03:51:55 crc kubenswrapper[4685]: I0321 03:51:55.057452 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 21 03:51:55 crc kubenswrapper[4685]: I0321 03:51:55.067622 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 21 03:51:55 crc kubenswrapper[4685]: I0321 03:51:55.070405 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 21 03:51:55 crc kubenswrapper[4685]: I0321 03:51:55.133077 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 21 03:51:55 crc kubenswrapper[4685]: I0321 03:51:55.145879 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 21 03:51:55 crc kubenswrapper[4685]: I0321 03:51:55.270781 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 21 03:51:55 crc kubenswrapper[4685]: I0321 03:51:55.280177 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 21 03:51:55 crc kubenswrapper[4685]: I0321 03:51:55.430119 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 21 03:51:55 crc kubenswrapper[4685]: I0321 
03:51:55.446549 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 21 03:51:55 crc kubenswrapper[4685]: I0321 03:51:55.519979 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 21 03:51:55 crc kubenswrapper[4685]: I0321 03:51:55.678073 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 21 03:51:55 crc kubenswrapper[4685]: I0321 03:51:55.743814 4685 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 21 03:51:55 crc kubenswrapper[4685]: I0321 03:51:55.754185 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 21 03:51:55 crc kubenswrapper[4685]: I0321 03:51:55.930202 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 21 03:51:56 crc kubenswrapper[4685]: I0321 03:51:56.050335 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 21 03:51:56 crc kubenswrapper[4685]: I0321 03:51:56.091973 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 21 03:51:56 crc kubenswrapper[4685]: I0321 03:51:56.291046 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 21 03:51:56 crc kubenswrapper[4685]: I0321 03:51:56.370492 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 21 03:51:56 crc kubenswrapper[4685]: I0321 03:51:56.371231 4685 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 21 03:51:56 crc kubenswrapper[4685]: I0321 03:51:56.384779 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 21 03:51:56 crc kubenswrapper[4685]: I0321 03:51:56.417951 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 21 03:51:56 crc kubenswrapper[4685]: I0321 03:51:56.485795 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 21 03:51:56 crc kubenswrapper[4685]: I0321 03:51:56.526063 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 21 03:51:56 crc kubenswrapper[4685]: I0321 03:51:56.594155 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 21 03:51:56 crc kubenswrapper[4685]: I0321 03:51:56.636791 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 21 03:51:56 crc kubenswrapper[4685]: I0321 03:51:56.929166 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 21 03:51:56 crc kubenswrapper[4685]: I0321 03:51:56.957739 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 21 03:51:56 crc kubenswrapper[4685]: I0321 03:51:56.999566 4685 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 21 03:51:57 crc kubenswrapper[4685]: I0321 03:51:57.008347 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 21 03:51:57 crc kubenswrapper[4685]: I0321 03:51:57.101055 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 21 03:51:57 crc kubenswrapper[4685]: I0321 03:51:57.105530 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 21 03:51:57 crc kubenswrapper[4685]: I0321 03:51:57.132727 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 21 03:51:57 crc kubenswrapper[4685]: I0321 03:51:57.335209 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 21 03:51:57 crc kubenswrapper[4685]: I0321 03:51:57.360943 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 21 03:51:57 crc kubenswrapper[4685]: I0321 03:51:57.436831 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 21 03:51:57 crc kubenswrapper[4685]: I0321 03:51:57.482422 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 21 03:51:57 crc kubenswrapper[4685]: I0321 03:51:57.505552 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 21 03:51:57 crc kubenswrapper[4685]: I0321 03:51:57.554656 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 21 03:51:57 crc kubenswrapper[4685]: I0321 03:51:57.596478 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 21 03:51:57 crc kubenswrapper[4685]: I0321 03:51:57.603918 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 21 03:51:57 crc kubenswrapper[4685]: I0321 03:51:57.623880 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 21 03:51:57 crc kubenswrapper[4685]: I0321 03:51:57.635486 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 21 03:51:57 crc kubenswrapper[4685]: I0321 03:51:57.709275 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 21 03:51:57 crc kubenswrapper[4685]: I0321 03:51:57.750811 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 21 03:51:57 crc kubenswrapper[4685]: I0321 03:51:57.770678 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 21 03:51:57 crc kubenswrapper[4685]: I0321 03:51:57.945776 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.113212 4685 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.224516 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.282701 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.355683 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.752718 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.865177 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.867446 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.886186 4685 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.887473 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=45.887452785 podStartE2EDuration="45.887452785s" podCreationTimestamp="2026-03-21 03:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:51:33.748540614 +0000 UTC m=+326.225609416" watchObservedRunningTime="2026-03-21 03:51:58.887452785 +0000 UTC m=+351.364521607"
Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.894094 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-route-controller-manager/route-controller-manager-5bc69c9d55-7trxv","openshift-authentication/oauth-openshift-558db77b4-m7q8h","openshift-controller-manager/controller-manager-75994f449d-jj86b"]
Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.894191 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-route-controller-manager/route-controller-manager-57cdb85995-tptvs","openshift-authentication/oauth-openshift-56cf947455-2gb7r","openshift-controller-manager/controller-manager-84cc94bbb5-fzmcp"]
Mar 21 03:51:58 crc kubenswrapper[4685]: E0321 03:51:58.894460 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b92b45a-4c33-4463-96c1-0d4227d1d118" containerName="installer"
Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.894489 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b92b45a-4c33-4463-96c1-0d4227d1d118" containerName="installer"
Mar 21 03:51:58 crc kubenswrapper[4685]: E0321 03:51:58.894512 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce8529fe-4546-466d-bb07-3ee73cf1bc1f" containerName="oauth-openshift"
Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.894526 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce8529fe-4546-466d-bb07-3ee73cf1bc1f" containerName="oauth-openshift"
Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.894796 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b92b45a-4c33-4463-96c1-0d4227d1d118" containerName="installer"
Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.894818 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce8529fe-4546-466d-bb07-3ee73cf1bc1f" containerName="oauth-openshift"
Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.894871 4685 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7fd9d618-b4ed-4942-b915-76dc59fb834a"
Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.894931 4685 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7fd9d618-b4ed-4942-b915-76dc59fb834a"
Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.895610 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r"
Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.897415 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57cdb85995-tptvs"
Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.899028 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-84cc94bbb5-fzmcp"
Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.900656 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.901071 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.901632 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.902552 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.902921 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.903308 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.903949 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.904025 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.904072 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.904145 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.904257 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.904366 4685 reflector.go:368] Caches populated for *v1.ConfigMap from
object-"openshift-route-controller-manager"/"client-ca" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.904478 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.904378 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.904505 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.904642 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.905186 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.905369 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.905831 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.906271 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.906501 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.908620 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.909329 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.918297 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.919917 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.921913 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.922499 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.924200 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.931026 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.941153 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 21 03:51:58 crc 
kubenswrapper[4685]: I0321 03:51:58.949631 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a7133c8b-7aed-4976-a349-18af9d78c198-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.949703 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cacbfdeb-1874-4a02-bdde-c5e06143d710-client-ca\") pod \"controller-manager-84cc94bbb5-fzmcp\" (UID: \"cacbfdeb-1874-4a02-bdde-c5e06143d710\") " pod="openshift-controller-manager/controller-manager-84cc94bbb5-fzmcp" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.949743 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3184a56e-acae-47d4-9dd6-9ee8c0e0d924-config\") pod \"route-controller-manager-57cdb85995-tptvs\" (UID: \"3184a56e-acae-47d4-9dd6-9ee8c0e0d924\") " pod="openshift-route-controller-manager/route-controller-manager-57cdb85995-tptvs" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.949772 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a7133c8b-7aed-4976-a349-18af9d78c198-v4-0-config-system-cliconfig\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.949797 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a7133c8b-7aed-4976-a349-18af9d78c198-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.949828 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tmjg\" (UniqueName: \"kubernetes.io/projected/3184a56e-acae-47d4-9dd6-9ee8c0e0d924-kube-api-access-4tmjg\") pod \"route-controller-manager-57cdb85995-tptvs\" (UID: \"3184a56e-acae-47d4-9dd6-9ee8c0e0d924\") " pod="openshift-route-controller-manager/route-controller-manager-57cdb85995-tptvs" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.949876 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7133c8b-7aed-4976-a349-18af9d78c198-v4-0-config-system-serving-cert\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.949920 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a7133c8b-7aed-4976-a349-18af9d78c198-v4-0-config-user-template-login\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: 
\"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.949953 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a7133c8b-7aed-4976-a349-18af9d78c198-v4-0-config-system-session\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.949976 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a7133c8b-7aed-4976-a349-18af9d78c198-v4-0-config-user-template-error\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.950011 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a7133c8b-7aed-4976-a349-18af9d78c198-v4-0-config-system-router-certs\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.950039 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a7133c8b-7aed-4976-a349-18af9d78c198-v4-0-config-system-service-ca\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.950066 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfcjw\" (UniqueName: \"kubernetes.io/projected/a7133c8b-7aed-4976-a349-18af9d78c198-kube-api-access-bfcjw\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.950089 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a7133c8b-7aed-4976-a349-18af9d78c198-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.950120 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cacbfdeb-1874-4a02-bdde-c5e06143d710-config\") pod \"controller-manager-84cc94bbb5-fzmcp\" (UID: \"cacbfdeb-1874-4a02-bdde-c5e06143d710\") " pod="openshift-controller-manager/controller-manager-84cc94bbb5-fzmcp" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.950150 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/3184a56e-acae-47d4-9dd6-9ee8c0e0d924-client-ca\") pod \"route-controller-manager-57cdb85995-tptvs\" (UID: \"3184a56e-acae-47d4-9dd6-9ee8c0e0d924\") " pod="openshift-route-controller-manager/route-controller-manager-57cdb85995-tptvs" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.950187 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7133c8b-7aed-4976-a349-18af9d78c198-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.950212 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a7133c8b-7aed-4976-a349-18af9d78c198-audit-dir\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.950240 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cacbfdeb-1874-4a02-bdde-c5e06143d710-serving-cert\") pod \"controller-manager-84cc94bbb5-fzmcp\" (UID: \"cacbfdeb-1874-4a02-bdde-c5e06143d710\") " pod="openshift-controller-manager/controller-manager-84cc94bbb5-fzmcp" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.950270 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrnnq\" (UniqueName: \"kubernetes.io/projected/cacbfdeb-1874-4a02-bdde-c5e06143d710-kube-api-access-wrnnq\") pod \"controller-manager-84cc94bbb5-fzmcp\" (UID: \"cacbfdeb-1874-4a02-bdde-c5e06143d710\") " pod="openshift-controller-manager/controller-manager-84cc94bbb5-fzmcp" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.950306 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3184a56e-acae-47d4-9dd6-9ee8c0e0d924-serving-cert\") pod \"route-controller-manager-57cdb85995-tptvs\" (UID: \"3184a56e-acae-47d4-9dd6-9ee8c0e0d924\") " pod="openshift-route-controller-manager/route-controller-manager-57cdb85995-tptvs" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.950343 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a7133c8b-7aed-4976-a349-18af9d78c198-audit-policies\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:58 crc kubenswrapper[4685]: I0321 03:51:58.950376 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cacbfdeb-1874-4a02-bdde-c5e06143d710-proxy-ca-bundles\") pod \"controller-manager-84cc94bbb5-fzmcp\" (UID: \"cacbfdeb-1874-4a02-bdde-c5e06143d710\") " pod="openshift-controller-manager/controller-manager-84cc94bbb5-fzmcp" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.002954 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=25.002938981 podStartE2EDuration="25.002938981s" podCreationTimestamp="2026-03-21 03:51:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:51:58.97732814 +0000 UTC m=+351.454396972" watchObservedRunningTime="2026-03-21 03:51:59.002938981 +0000 UTC m=+351.480007773" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.005954 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.029506 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.051670 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a7133c8b-7aed-4976-a349-18af9d78c198-v4-0-config-system-router-certs\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.051715 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a7133c8b-7aed-4976-a349-18af9d78c198-v4-0-config-system-service-ca\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.051735 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfcjw\" (UniqueName: \"kubernetes.io/projected/a7133c8b-7aed-4976-a349-18af9d78c198-kube-api-access-bfcjw\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.051759 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a7133c8b-7aed-4976-a349-18af9d78c198-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.051788 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cacbfdeb-1874-4a02-bdde-c5e06143d710-config\") pod \"controller-manager-84cc94bbb5-fzmcp\" (UID: \"cacbfdeb-1874-4a02-bdde-c5e06143d710\") " pod="openshift-controller-manager/controller-manager-84cc94bbb5-fzmcp" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.051819 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3184a56e-acae-47d4-9dd6-9ee8c0e0d924-client-ca\") pod \"route-controller-manager-57cdb85995-tptvs\" (UID: \"3184a56e-acae-47d4-9dd6-9ee8c0e0d924\") " pod="openshift-route-controller-manager/route-controller-manager-57cdb85995-tptvs" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.051859 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7133c8b-7aed-4976-a349-18af9d78c198-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.051885 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a7133c8b-7aed-4976-a349-18af9d78c198-audit-dir\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.051906 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cacbfdeb-1874-4a02-bdde-c5e06143d710-serving-cert\") pod \"controller-manager-84cc94bbb5-fzmcp\" (UID: \"cacbfdeb-1874-4a02-bdde-c5e06143d710\") " pod="openshift-controller-manager/controller-manager-84cc94bbb5-fzmcp" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.051927 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrnnq\" (UniqueName: \"kubernetes.io/projected/cacbfdeb-1874-4a02-bdde-c5e06143d710-kube-api-access-wrnnq\") pod \"controller-manager-84cc94bbb5-fzmcp\" (UID: \"cacbfdeb-1874-4a02-bdde-c5e06143d710\") " pod="openshift-controller-manager/controller-manager-84cc94bbb5-fzmcp" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.051949 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3184a56e-acae-47d4-9dd6-9ee8c0e0d924-serving-cert\") pod \"route-controller-manager-57cdb85995-tptvs\" (UID: \"3184a56e-acae-47d4-9dd6-9ee8c0e0d924\") " pod="openshift-route-controller-manager/route-controller-manager-57cdb85995-tptvs" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.051968 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a7133c8b-7aed-4976-a349-18af9d78c198-audit-policies\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.051989 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cacbfdeb-1874-4a02-bdde-c5e06143d710-proxy-ca-bundles\") pod \"controller-manager-84cc94bbb5-fzmcp\" (UID: \"cacbfdeb-1874-4a02-bdde-c5e06143d710\") " pod="openshift-controller-manager/controller-manager-84cc94bbb5-fzmcp" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.052010 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a7133c8b-7aed-4976-a349-18af9d78c198-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.052031 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cacbfdeb-1874-4a02-bdde-c5e06143d710-client-ca\") pod 
\"controller-manager-84cc94bbb5-fzmcp\" (UID: \"cacbfdeb-1874-4a02-bdde-c5e06143d710\") " pod="openshift-controller-manager/controller-manager-84cc94bbb5-fzmcp" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.052049 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3184a56e-acae-47d4-9dd6-9ee8c0e0d924-config\") pod \"route-controller-manager-57cdb85995-tptvs\" (UID: \"3184a56e-acae-47d4-9dd6-9ee8c0e0d924\") " pod="openshift-route-controller-manager/route-controller-manager-57cdb85995-tptvs" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.052064 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a7133c8b-7aed-4976-a349-18af9d78c198-v4-0-config-system-cliconfig\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.052082 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a7133c8b-7aed-4976-a349-18af9d78c198-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.052100 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tmjg\" (UniqueName: \"kubernetes.io/projected/3184a56e-acae-47d4-9dd6-9ee8c0e0d924-kube-api-access-4tmjg\") pod \"route-controller-manager-57cdb85995-tptvs\" (UID: \"3184a56e-acae-47d4-9dd6-9ee8c0e0d924\") " pod="openshift-route-controller-manager/route-controller-manager-57cdb85995-tptvs" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.052115 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7133c8b-7aed-4976-a349-18af9d78c198-v4-0-config-system-serving-cert\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.052141 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a7133c8b-7aed-4976-a349-18af9d78c198-v4-0-config-user-template-login\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.052156 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a7133c8b-7aed-4976-a349-18af9d78c198-v4-0-config-system-session\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.052172 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a7133c8b-7aed-4976-a349-18af9d78c198-v4-0-config-user-template-error\") 
pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.054051 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a7133c8b-7aed-4976-a349-18af9d78c198-v4-0-config-system-cliconfig\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.054099 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cacbfdeb-1874-4a02-bdde-c5e06143d710-client-ca\") pod \"controller-manager-84cc94bbb5-fzmcp\" (UID: \"cacbfdeb-1874-4a02-bdde-c5e06143d710\") " pod="openshift-controller-manager/controller-manager-84cc94bbb5-fzmcp" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.054559 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a7133c8b-7aed-4976-a349-18af9d78c198-audit-policies\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.054566 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cacbfdeb-1874-4a02-bdde-c5e06143d710-config\") pod \"controller-manager-84cc94bbb5-fzmcp\" (UID: \"cacbfdeb-1874-4a02-bdde-c5e06143d710\") " pod="openshift-controller-manager/controller-manager-84cc94bbb5-fzmcp" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.054705 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a7133c8b-7aed-4976-a349-18af9d78c198-v4-0-config-system-service-ca\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.054772 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3184a56e-acae-47d4-9dd6-9ee8c0e0d924-config\") pod \"route-controller-manager-57cdb85995-tptvs\" (UID: \"3184a56e-acae-47d4-9dd6-9ee8c0e0d924\") " pod="openshift-route-controller-manager/route-controller-manager-57cdb85995-tptvs" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.055258 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a7133c8b-7aed-4976-a349-18af9d78c198-audit-dir\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.056189 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cacbfdeb-1874-4a02-bdde-c5e06143d710-proxy-ca-bundles\") pod \"controller-manager-84cc94bbb5-fzmcp\" (UID: \"cacbfdeb-1874-4a02-bdde-c5e06143d710\") " pod="openshift-controller-manager/controller-manager-84cc94bbb5-fzmcp" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.056222 4685 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3184a56e-acae-47d4-9dd6-9ee8c0e0d924-client-ca\") pod \"route-controller-manager-57cdb85995-tptvs\" (UID: \"3184a56e-acae-47d4-9dd6-9ee8c0e0d924\") " pod="openshift-route-controller-manager/route-controller-manager-57cdb85995-tptvs" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.057929 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a7133c8b-7aed-4976-a349-18af9d78c198-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.058605 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7133c8b-7aed-4976-a349-18af9d78c198-v4-0-config-system-serving-cert\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.064386 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a7133c8b-7aed-4976-a349-18af9d78c198-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.065033 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a7133c8b-7aed-4976-a349-18af9d78c198-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.065967 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a7133c8b-7aed-4976-a349-18af9d78c198-v4-0-config-user-template-login\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.068166 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cacbfdeb-1874-4a02-bdde-c5e06143d710-serving-cert\") pod \"controller-manager-84cc94bbb5-fzmcp\" (UID: \"cacbfdeb-1874-4a02-bdde-c5e06143d710\") " pod="openshift-controller-manager/controller-manager-84cc94bbb5-fzmcp" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.070220 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7133c8b-7aed-4976-a349-18af9d78c198-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.070404 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a7133c8b-7aed-4976-a349-18af9d78c198-v4-0-config-system-session\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.070940 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a7133c8b-7aed-4976-a349-18af9d78c198-v4-0-config-system-router-certs\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.071470 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3184a56e-acae-47d4-9dd6-9ee8c0e0d924-serving-cert\") pod \"route-controller-manager-57cdb85995-tptvs\" (UID: \"3184a56e-acae-47d4-9dd6-9ee8c0e0d924\") " pod="openshift-route-controller-manager/route-controller-manager-57cdb85995-tptvs" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.073710 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tmjg\" (UniqueName: \"kubernetes.io/projected/3184a56e-acae-47d4-9dd6-9ee8c0e0d924-kube-api-access-4tmjg\") pod \"route-controller-manager-57cdb85995-tptvs\" (UID: \"3184a56e-acae-47d4-9dd6-9ee8c0e0d924\") " pod="openshift-route-controller-manager/route-controller-manager-57cdb85995-tptvs" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.076023 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfcjw\" (UniqueName: \"kubernetes.io/projected/a7133c8b-7aed-4976-a349-18af9d78c198-kube-api-access-bfcjw\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.078064 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrnnq\" (UniqueName: \"kubernetes.io/projected/cacbfdeb-1874-4a02-bdde-c5e06143d710-kube-api-access-wrnnq\") pod \"controller-manager-84cc94bbb5-fzmcp\" (UID: \"cacbfdeb-1874-4a02-bdde-c5e06143d710\") " pod="openshift-controller-manager/controller-manager-84cc94bbb5-fzmcp" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.082381 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a7133c8b-7aed-4976-a349-18af9d78c198-v4-0-config-user-template-error\") pod \"oauth-openshift-56cf947455-2gb7r\" (UID: \"a7133c8b-7aed-4976-a349-18af9d78c198\") " pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.096889 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.115790 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.219960 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.237067 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57cdb85995-tptvs" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.244304 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-84cc94bbb5-fzmcp" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.403010 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.515193 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.559180 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-84cc94bbb5-fzmcp"] Mar 21 03:51:59 crc kubenswrapper[4685]: W0321 03:51:59.563636 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcacbfdeb_1874_4a02_bdde_c5e06143d710.slice/crio-787cd80929c32cb0fb5c3666dcbee5c16c8a7e2fb7ab7a89b99c68a9be7bb122 WatchSource:0}: Error finding container 787cd80929c32cb0fb5c3666dcbee5c16c8a7e2fb7ab7a89b99c68a9be7bb122: Status 404 returned error can't find the container with id 787cd80929c32cb0fb5c3666dcbee5c16c8a7e2fb7ab7a89b99c68a9be7bb122 Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.599411 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.709792 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57cdb85995-tptvs"] Mar 21 03:51:59 crc kubenswrapper[4685]: W0321 03:51:59.713397 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3184a56e_acae_47d4_9dd6_9ee8c0e0d924.slice/crio-b9e530abfa1410e338ab67b844dd94d751ae3044ab4352edba489642869bf0bc WatchSource:0}: Error finding container b9e530abfa1410e338ab67b844dd94d751ae3044ab4352edba489642869bf0bc: Status 404 returned error can't find the container with id b9e530abfa1410e338ab67b844dd94d751ae3044ab4352edba489642869bf0bc Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.714441 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-56cf947455-2gb7r"] Mar 21 03:51:59 crc kubenswrapper[4685]: W0321 03:51:59.717030 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7133c8b_7aed_4976_a349_18af9d78c198.slice/crio-95f3486809b912ec7feb31b995a09ccbc1ef089e64e3f9d9a457fe06465e75c5 WatchSource:0}: Error finding container 95f3486809b912ec7feb31b995a09ccbc1ef089e64e3f9d9a457fe06465e75c5: Status 404 returned error can't find the container with id 95f3486809b912ec7feb31b995a09ccbc1ef089e64e3f9d9a457fe06465e75c5 Mar 21 03:51:59 crc kubenswrapper[4685]: I0321 03:51:59.828948 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 21 03:52:00 crc kubenswrapper[4685]: I0321 03:52:00.159794 4685 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29567752-khgzj"] Mar 21 03:52:00 crc kubenswrapper[4685]: I0321 03:52:00.160411 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567752-khgzj" Mar 21 03:52:00 crc kubenswrapper[4685]: I0321 03:52:00.165745 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 03:52:00 crc kubenswrapper[4685]: I0321 03:52:00.165861 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k75cc" Mar 21 03:52:00 crc kubenswrapper[4685]: I0321 03:52:00.166119 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 03:52:00 crc kubenswrapper[4685]: I0321 03:52:00.169850 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567752-khgzj"] Mar 21 03:52:00 crc kubenswrapper[4685]: I0321 03:52:00.171194 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 21 03:52:00 crc kubenswrapper[4685]: I0321 03:52:00.266152 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 21 03:52:00 crc kubenswrapper[4685]: I0321 03:52:00.270312 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zrvq\" (UniqueName: \"kubernetes.io/projected/795b0bef-045f-4b6d-8b0b-60b79ccbded1-kube-api-access-5zrvq\") pod \"auto-csr-approver-29567752-khgzj\" (UID: \"795b0bef-045f-4b6d-8b0b-60b79ccbded1\") " pod="openshift-infra/auto-csr-approver-29567752-khgzj" Mar 21 03:52:00 crc kubenswrapper[4685]: I0321 03:52:00.308751 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce8529fe-4546-466d-bb07-3ee73cf1bc1f" path="/var/lib/kubelet/pods/ce8529fe-4546-466d-bb07-3ee73cf1bc1f/volumes" Mar 21 03:52:00 crc kubenswrapper[4685]: I0321 03:52:00.309454 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de82f429-1aa8-465f-b944-2bfb17d7d26b" path="/var/lib/kubelet/pods/de82f429-1aa8-465f-b944-2bfb17d7d26b/volumes" Mar 21 03:52:00 crc kubenswrapper[4685]: I0321 03:52:00.309998 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ede8a25b-fb4b-4cbc-914d-1fc24155b8f1" path="/var/lib/kubelet/pods/ede8a25b-fb4b-4cbc-914d-1fc24155b8f1/volumes" Mar 21 03:52:00 crc kubenswrapper[4685]: I0321 03:52:00.313105 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84cc94bbb5-fzmcp" event={"ID":"cacbfdeb-1874-4a02-bdde-c5e06143d710","Type":"ContainerStarted","Data":"7cffa4faf9820ed74a7af533468998c3f050ffe93f6c7179183f50ef54a06c6e"} Mar 21 03:52:00 crc kubenswrapper[4685]: I0321 03:52:00.313196 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84cc94bbb5-fzmcp" event={"ID":"cacbfdeb-1874-4a02-bdde-c5e06143d710","Type":"ContainerStarted","Data":"787cd80929c32cb0fb5c3666dcbee5c16c8a7e2fb7ab7a89b99c68a9be7bb122"} Mar 21 03:52:00 crc kubenswrapper[4685]: I0321 03:52:00.313235 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-84cc94bbb5-fzmcp" Mar 21 03:52:00 crc kubenswrapper[4685]: I0321 03:52:00.314874 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-57cdb85995-tptvs" event={"ID":"3184a56e-acae-47d4-9dd6-9ee8c0e0d924","Type":"ContainerStarted","Data":"07d2f583c1eb8c81051fdd456bc2212803029da8189f8c3e9125c4993ee72562"} Mar 21 03:52:00 crc kubenswrapper[4685]: I0321 03:52:00.314912 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57cdb85995-tptvs" event={"ID":"3184a56e-acae-47d4-9dd6-9ee8c0e0d924","Type":"ContainerStarted","Data":"b9e530abfa1410e338ab67b844dd94d751ae3044ab4352edba489642869bf0bc"} Mar 21 03:52:00 crc kubenswrapper[4685]: I0321 03:52:00.315023 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-57cdb85995-tptvs" Mar 21 03:52:00 crc kubenswrapper[4685]: I0321 03:52:00.316830 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" event={"ID":"a7133c8b-7aed-4976-a349-18af9d78c198","Type":"ContainerStarted","Data":"decbb91b269044af4f596b6c7b1d53425efc032f1b26d6a4c9faa2f4cde44a49"} Mar 21 03:52:00 crc kubenswrapper[4685]: I0321 03:52:00.317223 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:52:00 crc kubenswrapper[4685]: I0321 03:52:00.317260 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" event={"ID":"a7133c8b-7aed-4976-a349-18af9d78c198","Type":"ContainerStarted","Data":"95f3486809b912ec7feb31b995a09ccbc1ef089e64e3f9d9a457fe06465e75c5"} Mar 21 03:52:00 crc kubenswrapper[4685]: I0321 03:52:00.320173 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-84cc94bbb5-fzmcp" Mar 21 03:52:00 crc kubenswrapper[4685]: I0321 03:52:00.325505 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-57cdb85995-tptvs" Mar 21 03:52:00 crc kubenswrapper[4685]: I0321 03:52:00.351206 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-84cc94bbb5-fzmcp" podStartSLOduration=48.351184769 podStartE2EDuration="48.351184769s" podCreationTimestamp="2026-03-21 03:51:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:52:00.333667912 +0000 UTC m=+352.810736704" watchObservedRunningTime="2026-03-21 03:52:00.351184769 +0000 UTC m=+352.828253561" Mar 21 03:52:00 crc kubenswrapper[4685]: I0321 03:52:00.365387 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 21 03:52:00 crc kubenswrapper[4685]: I0321 03:52:00.371099 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zrvq\" (UniqueName: \"kubernetes.io/projected/795b0bef-045f-4b6d-8b0b-60b79ccbded1-kube-api-access-5zrvq\") pod \"auto-csr-approver-29567752-khgzj\" (UID: \"795b0bef-045f-4b6d-8b0b-60b79ccbded1\") " pod="openshift-infra/auto-csr-approver-29567752-khgzj" Mar 21 03:52:00 crc kubenswrapper[4685]: I0321 03:52:00.373021 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 21 03:52:00 crc kubenswrapper[4685]: I0321 03:52:00.384635 4685 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-57cdb85995-tptvs" podStartSLOduration=48.384615036 podStartE2EDuration="48.384615036s" podCreationTimestamp="2026-03-21 03:51:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:52:00.382989177 +0000 UTC m=+352.860057969" watchObservedRunningTime="2026-03-21 03:52:00.384615036 +0000 UTC m=+352.861683828" Mar 21 03:52:00 crc kubenswrapper[4685]: I0321 03:52:00.394740 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zrvq\" (UniqueName: \"kubernetes.io/projected/795b0bef-045f-4b6d-8b0b-60b79ccbded1-kube-api-access-5zrvq\") pod \"auto-csr-approver-29567752-khgzj\" (UID: \"795b0bef-045f-4b6d-8b0b-60b79ccbded1\") " pod="openshift-infra/auto-csr-approver-29567752-khgzj" Mar 21 03:52:00 crc kubenswrapper[4685]: I0321 03:52:00.449387 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" podStartSLOduration=55.449363715 podStartE2EDuration="55.449363715s" podCreationTimestamp="2026-03-21 03:51:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:52:00.443750796 +0000 UTC m=+352.920819598" watchObservedRunningTime="2026-03-21 03:52:00.449363715 +0000 UTC m=+352.926432507" Mar 21 03:52:00 crc kubenswrapper[4685]: I0321 03:52:00.482403 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567752-khgzj" Mar 21 03:52:00 crc kubenswrapper[4685]: I0321 03:52:00.620613 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 21 03:52:00 crc kubenswrapper[4685]: I0321 03:52:00.861849 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-56cf947455-2gb7r" Mar 21 03:52:00 crc kubenswrapper[4685]: I0321 03:52:00.867219 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 21 03:52:00 crc kubenswrapper[4685]: I0321 03:52:00.871273 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567752-khgzj"] Mar 21 03:52:01 crc kubenswrapper[4685]: I0321 03:52:01.125981 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 21 03:52:01 crc kubenswrapper[4685]: I0321 03:52:01.325311 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567752-khgzj" event={"ID":"795b0bef-045f-4b6d-8b0b-60b79ccbded1","Type":"ContainerStarted","Data":"e84b25bdcdfde84fe2d615cf83235a28e7c53daa3b06092bfd7ed859fc8b59c8"} Mar 21 03:52:01 crc kubenswrapper[4685]: I0321 03:52:01.631465 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 21 03:52:01 crc kubenswrapper[4685]: I0321 03:52:01.666355 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 21 03:52:02 crc kubenswrapper[4685]: I0321 03:52:02.331136 4685 generic.go:334] "Generic (PLEG): container finished" 
podID="795b0bef-045f-4b6d-8b0b-60b79ccbded1" containerID="f1e28daa02ad3b1ab2ee46fac8b55ad548ffac6d7b98447347bf9ea0ae90df6d" exitCode=0 Mar 21 03:52:02 crc kubenswrapper[4685]: I0321 03:52:02.331237 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567752-khgzj" event={"ID":"795b0bef-045f-4b6d-8b0b-60b79ccbded1","Type":"ContainerDied","Data":"f1e28daa02ad3b1ab2ee46fac8b55ad548ffac6d7b98447347bf9ea0ae90df6d"} Mar 21 03:52:02 crc kubenswrapper[4685]: I0321 03:52:02.515219 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 21 03:52:02 crc kubenswrapper[4685]: I0321 03:52:02.678912 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 21 03:52:02 crc kubenswrapper[4685]: I0321 03:52:02.821294 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 21 03:52:03 crc kubenswrapper[4685]: I0321 03:52:03.800559 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567752-khgzj" Mar 21 03:52:03 crc kubenswrapper[4685]: I0321 03:52:03.923097 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zrvq\" (UniqueName: \"kubernetes.io/projected/795b0bef-045f-4b6d-8b0b-60b79ccbded1-kube-api-access-5zrvq\") pod \"795b0bef-045f-4b6d-8b0b-60b79ccbded1\" (UID: \"795b0bef-045f-4b6d-8b0b-60b79ccbded1\") " Mar 21 03:52:03 crc kubenswrapper[4685]: I0321 03:52:03.929887 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/795b0bef-045f-4b6d-8b0b-60b79ccbded1-kube-api-access-5zrvq" (OuterVolumeSpecName: "kube-api-access-5zrvq") pod "795b0bef-045f-4b6d-8b0b-60b79ccbded1" (UID: "795b0bef-045f-4b6d-8b0b-60b79ccbded1"). InnerVolumeSpecName "kube-api-access-5zrvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:52:04 crc kubenswrapper[4685]: I0321 03:52:04.024782 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zrvq\" (UniqueName: \"kubernetes.io/projected/795b0bef-045f-4b6d-8b0b-60b79ccbded1-kube-api-access-5zrvq\") on node \"crc\" DevicePath \"\"" Mar 21 03:52:04 crc kubenswrapper[4685]: I0321 03:52:04.344820 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567752-khgzj" event={"ID":"795b0bef-045f-4b6d-8b0b-60b79ccbded1","Type":"ContainerDied","Data":"e84b25bdcdfde84fe2d615cf83235a28e7c53daa3b06092bfd7ed859fc8b59c8"} Mar 21 03:52:04 crc kubenswrapper[4685]: I0321 03:52:04.344933 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e84b25bdcdfde84fe2d615cf83235a28e7c53daa3b06092bfd7ed859fc8b59c8" Mar 21 03:52:04 crc kubenswrapper[4685]: I0321 03:52:04.344993 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567752-khgzj" Mar 21 03:52:07 crc kubenswrapper[4685]: I0321 03:52:07.781726 4685 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 21 03:52:07 crc kubenswrapper[4685]: I0321 03:52:07.781994 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://ef0e430dc19ca6925ec78663fdc64f0e2ce9c40649685826ca31d883109b727c" gracePeriod=5 Mar 21 03:52:12 crc kubenswrapper[4685]: E0321 03:52:12.927006 4685 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-conmon-ef0e430dc19ca6925ec78663fdc64f0e2ce9c40649685826ca31d883109b727c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-ef0e430dc19ca6925ec78663fdc64f0e2ce9c40649685826ca31d883109b727c.scope\": RecentStats: unable to find data in memory cache]" Mar 21 03:52:13 crc kubenswrapper[4685]: I0321 03:52:13.377734 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 21 03:52:13 crc kubenswrapper[4685]: I0321 03:52:13.377829 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 03:52:13 crc kubenswrapper[4685]: I0321 03:52:13.395922 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 21 03:52:13 crc kubenswrapper[4685]: I0321 03:52:13.396209 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 03:52:13 crc kubenswrapper[4685]: I0321 03:52:13.396243 4685 scope.go:117] "RemoveContainer" containerID="ef0e430dc19ca6925ec78663fdc64f0e2ce9c40649685826ca31d883109b727c" Mar 21 03:52:13 crc kubenswrapper[4685]: I0321 03:52:13.396667 4685 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="ef0e430dc19ca6925ec78663fdc64f0e2ce9c40649685826ca31d883109b727c" exitCode=137 Mar 21 03:52:13 crc kubenswrapper[4685]: I0321 03:52:13.415857 4685 scope.go:117] "RemoveContainer" containerID="ef0e430dc19ca6925ec78663fdc64f0e2ce9c40649685826ca31d883109b727c" Mar 21 03:52:13 crc kubenswrapper[4685]: E0321 03:52:13.416436 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef0e430dc19ca6925ec78663fdc64f0e2ce9c40649685826ca31d883109b727c\": container with ID starting with ef0e430dc19ca6925ec78663fdc64f0e2ce9c40649685826ca31d883109b727c not found: ID does not exist" containerID="ef0e430dc19ca6925ec78663fdc64f0e2ce9c40649685826ca31d883109b727c" Mar 21 03:52:13 crc kubenswrapper[4685]: I0321 03:52:13.416468 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef0e430dc19ca6925ec78663fdc64f0e2ce9c40649685826ca31d883109b727c"} err="failed to get container status \"ef0e430dc19ca6925ec78663fdc64f0e2ce9c40649685826ca31d883109b727c\": rpc error: code = NotFound desc = could not find container \"ef0e430dc19ca6925ec78663fdc64f0e2ce9c40649685826ca31d883109b727c\": container with ID starting with ef0e430dc19ca6925ec78663fdc64f0e2ce9c40649685826ca31d883109b727c not found: ID does not exist" Mar 21 03:52:13 crc kubenswrapper[4685]: I0321 03:52:13.430931 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 21 03:52:13 crc kubenswrapper[4685]: I0321 03:52:13.431116 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 21 03:52:13 crc kubenswrapper[4685]: I0321 03:52:13.431245 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 21 03:52:13 crc kubenswrapper[4685]: I0321 03:52:13.431027 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 03:52:13 crc kubenswrapper[4685]: I0321 03:52:13.431359 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). 
InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 03:52:13 crc kubenswrapper[4685]: I0321 03:52:13.431393 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 21 03:52:13 crc kubenswrapper[4685]: I0321 03:52:13.431480 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 21 03:52:13 crc kubenswrapper[4685]: I0321 03:52:13.431571 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 03:52:13 crc kubenswrapper[4685]: I0321 03:52:13.431696 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 03:52:13 crc kubenswrapper[4685]: I0321 03:52:13.431861 4685 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 21 03:52:13 crc kubenswrapper[4685]: I0321 03:52:13.431878 4685 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 21 03:52:13 crc kubenswrapper[4685]: I0321 03:52:13.431889 4685 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 21 03:52:13 crc kubenswrapper[4685]: I0321 03:52:13.431899 4685 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 21 03:52:13 crc kubenswrapper[4685]: I0321 03:52:13.438559 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 03:52:13 crc kubenswrapper[4685]: I0321 03:52:13.532971 4685 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 21 03:52:14 crc kubenswrapper[4685]: I0321 03:52:14.307826 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 21 03:52:14 crc kubenswrapper[4685]: I0321 03:52:14.308460 4685 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 21 03:52:14 crc kubenswrapper[4685]: I0321 03:52:14.320519 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 21 03:52:14 crc kubenswrapper[4685]: I0321 03:52:14.320555 4685 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="dae53384-990e-4052-ab22-aa2ca3c4b75a" Mar 21 03:52:14 crc kubenswrapper[4685]: I0321 03:52:14.327380 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 21 03:52:14 crc kubenswrapper[4685]: I0321 03:52:14.327562 4685 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="dae53384-990e-4052-ab22-aa2ca3c4b75a" Mar 21 03:52:24 crc kubenswrapper[4685]: I0321 03:52:24.465281 4685 generic.go:334] "Generic (PLEG): container finished" podID="b0b74168-914c-4a2e-9122-c55d3bc3bcc2" containerID="8769ba1d49069d8d855eda602952eeb9e340b678969874cb89b2d49ff075670b" exitCode=0 Mar 21 03:52:24 crc kubenswrapper[4685]: I0321 03:52:24.465368 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pcwhp" event={"ID":"b0b74168-914c-4a2e-9122-c55d3bc3bcc2","Type":"ContainerDied","Data":"8769ba1d49069d8d855eda602952eeb9e340b678969874cb89b2d49ff075670b"} Mar 21 03:52:24 crc kubenswrapper[4685]: I0321 03:52:24.466132 4685 scope.go:117] "RemoveContainer" containerID="8769ba1d49069d8d855eda602952eeb9e340b678969874cb89b2d49ff075670b" Mar 21 03:52:25 crc kubenswrapper[4685]: I0321 03:52:25.473570 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pcwhp" event={"ID":"b0b74168-914c-4a2e-9122-c55d3bc3bcc2","Type":"ContainerStarted","Data":"633175c67129bc27f772dd44d7ed99c9055d531f9a9d88805aa434148307f2ab"} Mar 21 03:52:25 crc kubenswrapper[4685]: I0321 03:52:25.474391 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-pcwhp" Mar 21 03:52:25 crc kubenswrapper[4685]: I0321 03:52:25.476577 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-pcwhp" Mar 21 03:53:09 crc kubenswrapper[4685]: I0321 03:53:09.685210 4685 patch_prober.go:28] interesting pod/machine-config-daemon-7r9cg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 03:53:09 crc kubenswrapper[4685]: I0321 03:53:09.686767 
4685 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.534146 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xvjs7"] Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.534904 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xvjs7" podUID="53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd" containerName="registry-server" containerID="cri-o://4058faaca0183c0d4f767cb3a53404b38da72a2f4590534f2000518b56c87aa4" gracePeriod=30 Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.544112 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mhc9s"] Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.544646 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mhc9s" podUID="931ed0e7-7ffb-48ba-92b0-28883a6f0b39" containerName="registry-server" containerID="cri-o://2ce4d1c38370e6fe7108157ec26a4478d484b05528860fe73b8f7f736aadd4c6" gracePeriod=30 Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.565330 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pcwhp"] Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.565604 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-pcwhp" podUID="b0b74168-914c-4a2e-9122-c55d3bc3bcc2" containerName="marketplace-operator" containerID="cri-o://633175c67129bc27f772dd44d7ed99c9055d531f9a9d88805aa434148307f2ab" gracePeriod=30 Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.576159 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-h9g9c"] Mar 21 03:53:30 crc kubenswrapper[4685]: E0321 03:53:30.576712 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.576726 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 21 03:53:30 crc kubenswrapper[4685]: E0321 03:53:30.576736 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="795b0bef-045f-4b6d-8b0b-60b79ccbded1" containerName="oc" Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.576743 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="795b0bef-045f-4b6d-8b0b-60b79ccbded1" containerName="oc" Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.576895 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="795b0bef-045f-4b6d-8b0b-60b79ccbded1" containerName="oc" Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.576919 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.577439 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-h9g9c" Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.579625 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s6khh"] Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.579904 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-s6khh" podUID="9c1f4e4f-a993-423e-8922-d8b81967d483" containerName="registry-server" containerID="cri-o://b550faf13202a2d7d77ba9b28afd1da552906f0f94ea8be2f95b70e5f93e8266" gracePeriod=30 Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.587407 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lnbh8"] Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.587931 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lnbh8" podUID="4d6c412c-bc35-4360-91b0-06f8b60e7106" containerName="registry-server" containerID="cri-o://bfe98363c9199450c193ea8c4bdfc54b17679ec5de3d6d5dfb44d2eb7f6aa678" gracePeriod=30 Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.598347 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-h9g9c"] Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.712767 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1efd0452-eb45-4336-a0eb-2e171d3da229-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-h9g9c\" (UID: \"1efd0452-eb45-4336-a0eb-2e171d3da229\") " pod="openshift-marketplace/marketplace-operator-79b997595-h9g9c" Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.712840 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l848z\" (UniqueName: \"kubernetes.io/projected/1efd0452-eb45-4336-a0eb-2e171d3da229-kube-api-access-l848z\") pod \"marketplace-operator-79b997595-h9g9c\" (UID: \"1efd0452-eb45-4336-a0eb-2e171d3da229\") " pod="openshift-marketplace/marketplace-operator-79b997595-h9g9c" Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.712901 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1efd0452-eb45-4336-a0eb-2e171d3da229-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-h9g9c\" (UID: \"1efd0452-eb45-4336-a0eb-2e171d3da229\") " pod="openshift-marketplace/marketplace-operator-79b997595-h9g9c" Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.813807 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1efd0452-eb45-4336-a0eb-2e171d3da229-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-h9g9c\" (UID: \"1efd0452-eb45-4336-a0eb-2e171d3da229\") " pod="openshift-marketplace/marketplace-operator-79b997595-h9g9c" Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.813897 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l848z\" (UniqueName: \"kubernetes.io/projected/1efd0452-eb45-4336-a0eb-2e171d3da229-kube-api-access-l848z\") pod \"marketplace-operator-79b997595-h9g9c\" (UID: \"1efd0452-eb45-4336-a0eb-2e171d3da229\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-h9g9c" Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.813930 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1efd0452-eb45-4336-a0eb-2e171d3da229-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-h9g9c\" (UID: \"1efd0452-eb45-4336-a0eb-2e171d3da229\") " pod="openshift-marketplace/marketplace-operator-79b997595-h9g9c" Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.815194 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1efd0452-eb45-4336-a0eb-2e171d3da229-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-h9g9c\" (UID: \"1efd0452-eb45-4336-a0eb-2e171d3da229\") " pod="openshift-marketplace/marketplace-operator-79b997595-h9g9c" Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.819819 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1efd0452-eb45-4336-a0eb-2e171d3da229-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-h9g9c\" (UID: \"1efd0452-eb45-4336-a0eb-2e171d3da229\") " pod="openshift-marketplace/marketplace-operator-79b997595-h9g9c" Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.832675 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l848z\" (UniqueName: \"kubernetes.io/projected/1efd0452-eb45-4336-a0eb-2e171d3da229-kube-api-access-l848z\") pod \"marketplace-operator-79b997595-h9g9c\" (UID: \"1efd0452-eb45-4336-a0eb-2e171d3da229\") " pod="openshift-marketplace/marketplace-operator-79b997595-h9g9c" Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.841852 4685 generic.go:334] "Generic (PLEG): container finished" podID="b0b74168-914c-4a2e-9122-c55d3bc3bcc2" containerID="633175c67129bc27f772dd44d7ed99c9055d531f9a9d88805aa434148307f2ab" exitCode=0 Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.841966 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pcwhp" event={"ID":"b0b74168-914c-4a2e-9122-c55d3bc3bcc2","Type":"ContainerDied","Data":"633175c67129bc27f772dd44d7ed99c9055d531f9a9d88805aa434148307f2ab"} Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.842054 4685 scope.go:117] "RemoveContainer" containerID="8769ba1d49069d8d855eda602952eeb9e340b678969874cb89b2d49ff075670b" Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.846340 4685 generic.go:334] "Generic (PLEG): container finished" podID="931ed0e7-7ffb-48ba-92b0-28883a6f0b39" containerID="2ce4d1c38370e6fe7108157ec26a4478d484b05528860fe73b8f7f736aadd4c6" exitCode=0 Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.846395 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mhc9s" event={"ID":"931ed0e7-7ffb-48ba-92b0-28883a6f0b39","Type":"ContainerDied","Data":"2ce4d1c38370e6fe7108157ec26a4478d484b05528860fe73b8f7f736aadd4c6"} Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.850610 4685 generic.go:334] "Generic (PLEG): container finished" podID="9c1f4e4f-a993-423e-8922-d8b81967d483" containerID="b550faf13202a2d7d77ba9b28afd1da552906f0f94ea8be2f95b70e5f93e8266" exitCode=0 Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.850665 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s6khh" 
event={"ID":"9c1f4e4f-a993-423e-8922-d8b81967d483","Type":"ContainerDied","Data":"b550faf13202a2d7d77ba9b28afd1da552906f0f94ea8be2f95b70e5f93e8266"} Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.855379 4685 generic.go:334] "Generic (PLEG): container finished" podID="53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd" containerID="4058faaca0183c0d4f767cb3a53404b38da72a2f4590534f2000518b56c87aa4" exitCode=0 Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.855437 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xvjs7" event={"ID":"53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd","Type":"ContainerDied","Data":"4058faaca0183c0d4f767cb3a53404b38da72a2f4590534f2000518b56c87aa4"} Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.858143 4685 generic.go:334] "Generic (PLEG): container finished" podID="4d6c412c-bc35-4360-91b0-06f8b60e7106" containerID="bfe98363c9199450c193ea8c4bdfc54b17679ec5de3d6d5dfb44d2eb7f6aa678" exitCode=0 Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.858169 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lnbh8" event={"ID":"4d6c412c-bc35-4360-91b0-06f8b60e7106","Type":"ContainerDied","Data":"bfe98363c9199450c193ea8c4bdfc54b17679ec5de3d6d5dfb44d2eb7f6aa678"} Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.876692 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-h9g9c" Mar 21 03:53:30 crc kubenswrapper[4685]: I0321 03:53:30.917983 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xvjs7" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.015324 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd-utilities\") pod \"53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd\" (UID: \"53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd\") " Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.015648 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd-catalog-content\") pod \"53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd\" (UID: \"53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd\") " Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.015675 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27mn8\" (UniqueName: \"kubernetes.io/projected/53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd-kube-api-access-27mn8\") pod \"53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd\" (UID: \"53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd\") " Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.017229 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd-utilities" (OuterVolumeSpecName: "utilities") pod "53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd" (UID: "53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.031768 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd-kube-api-access-27mn8" (OuterVolumeSpecName: "kube-api-access-27mn8") pod "53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd" (UID: "53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd"). InnerVolumeSpecName "kube-api-access-27mn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.049410 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s6khh" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.053370 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lnbh8" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.078757 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd" (UID: "53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.081595 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-pcwhp" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.081976 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mhc9s" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.116712 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px57g\" (UniqueName: \"kubernetes.io/projected/9c1f4e4f-a993-423e-8922-d8b81967d483-kube-api-access-px57g\") pod \"9c1f4e4f-a993-423e-8922-d8b81967d483\" (UID: \"9c1f4e4f-a993-423e-8922-d8b81967d483\") " Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.116806 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wm8t5\" (UniqueName: \"kubernetes.io/projected/4d6c412c-bc35-4360-91b0-06f8b60e7106-kube-api-access-wm8t5\") pod \"4d6c412c-bc35-4360-91b0-06f8b60e7106\" (UID: \"4d6c412c-bc35-4360-91b0-06f8b60e7106\") " Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.116887 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c1f4e4f-a993-423e-8922-d8b81967d483-utilities\") pod \"9c1f4e4f-a993-423e-8922-d8b81967d483\" (UID: \"9c1f4e4f-a993-423e-8922-d8b81967d483\") " Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.116938 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d6c412c-bc35-4360-91b0-06f8b60e7106-catalog-content\") pod \"4d6c412c-bc35-4360-91b0-06f8b60e7106\" (UID: \"4d6c412c-bc35-4360-91b0-06f8b60e7106\") " Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.116953 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d6c412c-bc35-4360-91b0-06f8b60e7106-utilities\") pod \"4d6c412c-bc35-4360-91b0-06f8b60e7106\" (UID: \"4d6c412c-bc35-4360-91b0-06f8b60e7106\") " Mar 21 
03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.116974 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c1f4e4f-a993-423e-8922-d8b81967d483-catalog-content\") pod \"9c1f4e4f-a993-423e-8922-d8b81967d483\" (UID: \"9c1f4e4f-a993-423e-8922-d8b81967d483\") " Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.117204 4685 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.117216 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27mn8\" (UniqueName: \"kubernetes.io/projected/53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd-kube-api-access-27mn8\") on node \"crc\" DevicePath \"\"" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.117226 4685 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.119073 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c1f4e4f-a993-423e-8922-d8b81967d483-utilities" (OuterVolumeSpecName: "utilities") pod "9c1f4e4f-a993-423e-8922-d8b81967d483" (UID: "9c1f4e4f-a993-423e-8922-d8b81967d483"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.121196 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c1f4e4f-a993-423e-8922-d8b81967d483-kube-api-access-px57g" (OuterVolumeSpecName: "kube-api-access-px57g") pod "9c1f4e4f-a993-423e-8922-d8b81967d483" (UID: "9c1f4e4f-a993-423e-8922-d8b81967d483"). InnerVolumeSpecName "kube-api-access-px57g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.123058 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d6c412c-bc35-4360-91b0-06f8b60e7106-kube-api-access-wm8t5" (OuterVolumeSpecName: "kube-api-access-wm8t5") pod "4d6c412c-bc35-4360-91b0-06f8b60e7106" (UID: "4d6c412c-bc35-4360-91b0-06f8b60e7106"). InnerVolumeSpecName "kube-api-access-wm8t5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.124936 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d6c412c-bc35-4360-91b0-06f8b60e7106-utilities" (OuterVolumeSpecName: "utilities") pod "4d6c412c-bc35-4360-91b0-06f8b60e7106" (UID: "4d6c412c-bc35-4360-91b0-06f8b60e7106"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.142256 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-h9g9c"] Mar 21 03:53:31 crc kubenswrapper[4685]: W0321 03:53:31.147024 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1efd0452_eb45_4336_a0eb_2e171d3da229.slice/crio-59b65926ab6e6c723c09207cc3229378af24a28e37941af2f205ba6b4b2e7e51 WatchSource:0}: Error finding container 59b65926ab6e6c723c09207cc3229378af24a28e37941af2f205ba6b4b2e7e51: Status 404 returned error can't find the container with id 59b65926ab6e6c723c09207cc3229378af24a28e37941af2f205ba6b4b2e7e51 Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.161520 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c1f4e4f-a993-423e-8922-d8b81967d483-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9c1f4e4f-a993-423e-8922-d8b81967d483" (UID: "9c1f4e4f-a993-423e-8922-d8b81967d483"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.218412 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/931ed0e7-7ffb-48ba-92b0-28883a6f0b39-utilities\") pod \"931ed0e7-7ffb-48ba-92b0-28883a6f0b39\" (UID: \"931ed0e7-7ffb-48ba-92b0-28883a6f0b39\") " Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.218474 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b0b74168-914c-4a2e-9122-c55d3bc3bcc2-marketplace-trusted-ca\") pod \"b0b74168-914c-4a2e-9122-c55d3bc3bcc2\" (UID: \"b0b74168-914c-4a2e-9122-c55d3bc3bcc2\") " Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.218515 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/931ed0e7-7ffb-48ba-92b0-28883a6f0b39-catalog-content\") pod \"931ed0e7-7ffb-48ba-92b0-28883a6f0b39\" (UID: \"931ed0e7-7ffb-48ba-92b0-28883a6f0b39\") " Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.218549 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtx65\" (UniqueName: \"kubernetes.io/projected/931ed0e7-7ffb-48ba-92b0-28883a6f0b39-kube-api-access-jtx65\") pod \"931ed0e7-7ffb-48ba-92b0-28883a6f0b39\" (UID: \"931ed0e7-7ffb-48ba-92b0-28883a6f0b39\") " Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.218567 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b0b74168-914c-4a2e-9122-c55d3bc3bcc2-marketplace-operator-metrics\") pod \"b0b74168-914c-4a2e-9122-c55d3bc3bcc2\" (UID: \"b0b74168-914c-4a2e-9122-c55d3bc3bcc2\") " Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.218587 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkt2m\" (UniqueName: \"kubernetes.io/projected/b0b74168-914c-4a2e-9122-c55d3bc3bcc2-kube-api-access-hkt2m\") pod \"b0b74168-914c-4a2e-9122-c55d3bc3bcc2\" (UID: \"b0b74168-914c-4a2e-9122-c55d3bc3bcc2\") " Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.218772 4685 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-px57g\" (UniqueName: \"kubernetes.io/projected/9c1f4e4f-a993-423e-8922-d8b81967d483-kube-api-access-px57g\") on node \"crc\" DevicePath \"\"" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.218784 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wm8t5\" (UniqueName: \"kubernetes.io/projected/4d6c412c-bc35-4360-91b0-06f8b60e7106-kube-api-access-wm8t5\") on node \"crc\" DevicePath \"\"" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.218805 4685 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c1f4e4f-a993-423e-8922-d8b81967d483-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.218813 4685 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d6c412c-bc35-4360-91b0-06f8b60e7106-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.218822 4685 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c1f4e4f-a993-423e-8922-d8b81967d483-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.220165 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/931ed0e7-7ffb-48ba-92b0-28883a6f0b39-utilities" (OuterVolumeSpecName: "utilities") pod "931ed0e7-7ffb-48ba-92b0-28883a6f0b39" (UID: "931ed0e7-7ffb-48ba-92b0-28883a6f0b39"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.220184 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0b74168-914c-4a2e-9122-c55d3bc3bcc2-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b0b74168-914c-4a2e-9122-c55d3bc3bcc2" (UID: "b0b74168-914c-4a2e-9122-c55d3bc3bcc2"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.222799 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0b74168-914c-4a2e-9122-c55d3bc3bcc2-kube-api-access-hkt2m" (OuterVolumeSpecName: "kube-api-access-hkt2m") pod "b0b74168-914c-4a2e-9122-c55d3bc3bcc2" (UID: "b0b74168-914c-4a2e-9122-c55d3bc3bcc2"). InnerVolumeSpecName "kube-api-access-hkt2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.224086 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/931ed0e7-7ffb-48ba-92b0-28883a6f0b39-kube-api-access-jtx65" (OuterVolumeSpecName: "kube-api-access-jtx65") pod "931ed0e7-7ffb-48ba-92b0-28883a6f0b39" (UID: "931ed0e7-7ffb-48ba-92b0-28883a6f0b39"). InnerVolumeSpecName "kube-api-access-jtx65". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.225770 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b74168-914c-4a2e-9122-c55d3bc3bcc2-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b0b74168-914c-4a2e-9122-c55d3bc3bcc2" (UID: "b0b74168-914c-4a2e-9122-c55d3bc3bcc2"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.266660 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d6c412c-bc35-4360-91b0-06f8b60e7106-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d6c412c-bc35-4360-91b0-06f8b60e7106" (UID: "4d6c412c-bc35-4360-91b0-06f8b60e7106"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.270764 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/931ed0e7-7ffb-48ba-92b0-28883a6f0b39-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "931ed0e7-7ffb-48ba-92b0-28883a6f0b39" (UID: "931ed0e7-7ffb-48ba-92b0-28883a6f0b39"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.319694 4685 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/931ed0e7-7ffb-48ba-92b0-28883a6f0b39-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.319718 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtx65\" (UniqueName: \"kubernetes.io/projected/931ed0e7-7ffb-48ba-92b0-28883a6f0b39-kube-api-access-jtx65\") on node \"crc\" DevicePath \"\"" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.319729 4685 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b0b74168-914c-4a2e-9122-c55d3bc3bcc2-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.319737 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkt2m\" (UniqueName: \"kubernetes.io/projected/b0b74168-914c-4a2e-9122-c55d3bc3bcc2-kube-api-access-hkt2m\") on node \"crc\" DevicePath \"\"" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.319746 4685 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d6c412c-bc35-4360-91b0-06f8b60e7106-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.319755 4685 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/931ed0e7-7ffb-48ba-92b0-28883a6f0b39-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.319763 4685 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b0b74168-914c-4a2e-9122-c55d3bc3bcc2-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.865379 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mhc9s" event={"ID":"931ed0e7-7ffb-48ba-92b0-28883a6f0b39","Type":"ContainerDied","Data":"1859afe92c3043eba35a35a14ff3ee7043637b608c0e4d98d6e7be5830f58b89"} Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.865718 4685 scope.go:117] "RemoveContainer" containerID="2ce4d1c38370e6fe7108157ec26a4478d484b05528860fe73b8f7f736aadd4c6" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.865450 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mhc9s" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.868482 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s6khh" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.868508 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s6khh" event={"ID":"9c1f4e4f-a993-423e-8922-d8b81967d483","Type":"ContainerDied","Data":"b73d0a9e450c9d0b42d69d2692d9c06def486d16557e10ec98273207144ee2d9"} Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.872264 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xvjs7" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.872308 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xvjs7" event={"ID":"53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd","Type":"ContainerDied","Data":"c712b4fea9a21c3aceff31bbcdd94562437e64de7d6cd12c4e3c1c4fe3c45589"} Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.873163 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-h9g9c" event={"ID":"1efd0452-eb45-4336-a0eb-2e171d3da229","Type":"ContainerStarted","Data":"d74fae6b4cdc946588a27bf59219e10d7b8ae6e7bdf5e6b720258595d7a6c609"} Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.873190 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-h9g9c" event={"ID":"1efd0452-eb45-4336-a0eb-2e171d3da229","Type":"ContainerStarted","Data":"59b65926ab6e6c723c09207cc3229378af24a28e37941af2f205ba6b4b2e7e51"} Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.874058 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-h9g9c" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.878244 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lnbh8" event={"ID":"4d6c412c-bc35-4360-91b0-06f8b60e7106","Type":"ContainerDied","Data":"9e5b9e6ab7081e16fb1172c0d1f9dee7bde06a7cd002e2c62ddefbefc368a5f3"} Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.878359 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lnbh8" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.880188 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-h9g9c" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.882301 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pcwhp" event={"ID":"b0b74168-914c-4a2e-9122-c55d3bc3bcc2","Type":"ContainerDied","Data":"3e485c3e61eb11f68474a81f868186ce97eaccf2ea2e9b951dc92852fd082d7a"} Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.882418 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-pcwhp" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.895055 4685 scope.go:117] "RemoveContainer" containerID="0c95c902d01048b665b6b0be40dc6700543f1adc2471a719ebd801b93210b6dc" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.907367 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-h9g9c" podStartSLOduration=1.907329117 podStartE2EDuration="1.907329117s" podCreationTimestamp="2026-03-21 03:53:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:53:31.902979306 +0000 UTC m=+444.380048098" watchObservedRunningTime="2026-03-21 03:53:31.907329117 +0000 UTC m=+444.384397909" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.931556 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s6khh"] Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.940616 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-s6khh"] Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.945010 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mhc9s"] Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.947694 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mhc9s"] Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.951965 4685 scope.go:117] "RemoveContainer" containerID="2d30a009615dd8540d7c08261ee0942dc7adaa78396509206fadc21d0befa89d" Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.971375 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pcwhp"] Mar 21 03:53:31 crc kubenswrapper[4685]: I0321 03:53:31.986003 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pcwhp"] Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.002591 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xvjs7"] Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.002674 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xvjs7"] Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.008006 4685 scope.go:117] "RemoveContainer" containerID="b550faf13202a2d7d77ba9b28afd1da552906f0f94ea8be2f95b70e5f93e8266" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.011497 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lnbh8"] Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.014890 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lnbh8"] Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.021299 4685 scope.go:117] "RemoveContainer" containerID="d1512e8f3dde8ccc1e35f43bb7da1f2aa36a9b112b74febd6ec320b3dc4cf06d" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.039516 4685 scope.go:117] "RemoveContainer" containerID="6ecb8492ed241704c3e9700021f409ab933f5a648a7549ec3abda63dab206bf8" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.055409 4685 scope.go:117] "RemoveContainer" containerID="4058faaca0183c0d4f767cb3a53404b38da72a2f4590534f2000518b56c87aa4" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 
03:53:32.076439 4685 scope.go:117] "RemoveContainer" containerID="da61e0b016b7a40b491b616c4845382c3d01209a10c6acc87a24c0afcf8d5428" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.088594 4685 scope.go:117] "RemoveContainer" containerID="0d275f816389391c63c170323b6c6b30d2491a63ef326c02e04fa5d6eee4b719" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.099409 4685 scope.go:117] "RemoveContainer" containerID="bfe98363c9199450c193ea8c4bdfc54b17679ec5de3d6d5dfb44d2eb7f6aa678" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.109929 4685 scope.go:117] "RemoveContainer" containerID="f7816f09862dfe2024ef340f42986a62c4e444dc50da83fe6d3101a439d9efca" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.123468 4685 scope.go:117] "RemoveContainer" containerID="6d88af22f84c63a6294a89516a0484847f94f27b741e4b1dac713f35487b7d05" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.134587 4685 scope.go:117] "RemoveContainer" containerID="633175c67129bc27f772dd44d7ed99c9055d531f9a9d88805aa434148307f2ab" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.307150 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d6c412c-bc35-4360-91b0-06f8b60e7106" path="/var/lib/kubelet/pods/4d6c412c-bc35-4360-91b0-06f8b60e7106/volumes" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.308389 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd" path="/var/lib/kubelet/pods/53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd/volumes" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.309300 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="931ed0e7-7ffb-48ba-92b0-28883a6f0b39" path="/var/lib/kubelet/pods/931ed0e7-7ffb-48ba-92b0-28883a6f0b39/volumes" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.310555 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c1f4e4f-a993-423e-8922-d8b81967d483" path="/var/lib/kubelet/pods/9c1f4e4f-a993-423e-8922-d8b81967d483/volumes" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.311367 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0b74168-914c-4a2e-9122-c55d3bc3bcc2" path="/var/lib/kubelet/pods/b0b74168-914c-4a2e-9122-c55d3bc3bcc2/volumes" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.876433 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gn69w"] Mar 21 03:53:32 crc kubenswrapper[4685]: E0321 03:53:32.876655 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b74168-914c-4a2e-9122-c55d3bc3bcc2" containerName="marketplace-operator" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.876672 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b74168-914c-4a2e-9122-c55d3bc3bcc2" containerName="marketplace-operator" Mar 21 03:53:32 crc kubenswrapper[4685]: E0321 03:53:32.876682 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c1f4e4f-a993-423e-8922-d8b81967d483" containerName="registry-server" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.876690 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c1f4e4f-a993-423e-8922-d8b81967d483" containerName="registry-server" Mar 21 03:53:32 crc kubenswrapper[4685]: E0321 03:53:32.876704 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c1f4e4f-a993-423e-8922-d8b81967d483" containerName="extract-utilities" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.876712 4685 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="9c1f4e4f-a993-423e-8922-d8b81967d483" containerName="extract-utilities" Mar 21 03:53:32 crc kubenswrapper[4685]: E0321 03:53:32.876721 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd" containerName="extract-utilities" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.876728 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd" containerName="extract-utilities" Mar 21 03:53:32 crc kubenswrapper[4685]: E0321 03:53:32.876737 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="931ed0e7-7ffb-48ba-92b0-28883a6f0b39" containerName="extract-utilities" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.876746 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="931ed0e7-7ffb-48ba-92b0-28883a6f0b39" containerName="extract-utilities" Mar 21 03:53:32 crc kubenswrapper[4685]: E0321 03:53:32.876755 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b74168-914c-4a2e-9122-c55d3bc3bcc2" containerName="marketplace-operator" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.876762 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b74168-914c-4a2e-9122-c55d3bc3bcc2" containerName="marketplace-operator" Mar 21 03:53:32 crc kubenswrapper[4685]: E0321 03:53:32.876772 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="931ed0e7-7ffb-48ba-92b0-28883a6f0b39" containerName="registry-server" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.876779 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="931ed0e7-7ffb-48ba-92b0-28883a6f0b39" containerName="registry-server" Mar 21 03:53:32 crc kubenswrapper[4685]: E0321 03:53:32.876793 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="931ed0e7-7ffb-48ba-92b0-28883a6f0b39" containerName="extract-content" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.876800 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="931ed0e7-7ffb-48ba-92b0-28883a6f0b39" containerName="extract-content" Mar 21 03:53:32 crc kubenswrapper[4685]: E0321 03:53:32.876812 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d6c412c-bc35-4360-91b0-06f8b60e7106" containerName="extract-content" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.876820 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d6c412c-bc35-4360-91b0-06f8b60e7106" containerName="extract-content" Mar 21 03:53:32 crc kubenswrapper[4685]: E0321 03:53:32.876829 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd" containerName="extract-content" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.876840 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd" containerName="extract-content" Mar 21 03:53:32 crc kubenswrapper[4685]: E0321 03:53:32.876871 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d6c412c-bc35-4360-91b0-06f8b60e7106" containerName="registry-server" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.876878 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d6c412c-bc35-4360-91b0-06f8b60e7106" containerName="registry-server" Mar 21 03:53:32 crc kubenswrapper[4685]: E0321 03:53:32.876887 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d6c412c-bc35-4360-91b0-06f8b60e7106" containerName="extract-utilities" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 
03:53:32.876893 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d6c412c-bc35-4360-91b0-06f8b60e7106" containerName="extract-utilities" Mar 21 03:53:32 crc kubenswrapper[4685]: E0321 03:53:32.876900 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c1f4e4f-a993-423e-8922-d8b81967d483" containerName="extract-content" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.876906 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c1f4e4f-a993-423e-8922-d8b81967d483" containerName="extract-content" Mar 21 03:53:32 crc kubenswrapper[4685]: E0321 03:53:32.876913 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd" containerName="registry-server" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.876918 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd" containerName="registry-server" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.877000 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c1f4e4f-a993-423e-8922-d8b81967d483" containerName="registry-server" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.877007 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="931ed0e7-7ffb-48ba-92b0-28883a6f0b39" containerName="registry-server" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.877017 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d6c412c-bc35-4360-91b0-06f8b60e7106" containerName="registry-server" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.877023 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b74168-914c-4a2e-9122-c55d3bc3bcc2" containerName="marketplace-operator" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.877033 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b74168-914c-4a2e-9122-c55d3bc3bcc2" containerName="marketplace-operator" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.877041 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="53ad7ff2-a7d3-4ad6-8e97-14542e4a99cd" containerName="registry-server" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.877662 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gn69w" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.882685 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.888850 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gn69w"] Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.939026 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8038baba-1420-4797-9752-5490c0940929-catalog-content\") pod \"community-operators-gn69w\" (UID: \"8038baba-1420-4797-9752-5490c0940929\") " pod="openshift-marketplace/community-operators-gn69w" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.939210 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzlw7\" (UniqueName: \"kubernetes.io/projected/8038baba-1420-4797-9752-5490c0940929-kube-api-access-gzlw7\") pod \"community-operators-gn69w\" (UID: \"8038baba-1420-4797-9752-5490c0940929\") " pod="openshift-marketplace/community-operators-gn69w" Mar 21 03:53:32 crc kubenswrapper[4685]: I0321 03:53:32.939422 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8038baba-1420-4797-9752-5490c0940929-utilities\") pod \"community-operators-gn69w\" (UID: \"8038baba-1420-4797-9752-5490c0940929\") " pod="openshift-marketplace/community-operators-gn69w" Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.041424 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8038baba-1420-4797-9752-5490c0940929-catalog-content\") pod \"community-operators-gn69w\" (UID: \"8038baba-1420-4797-9752-5490c0940929\") " pod="openshift-marketplace/community-operators-gn69w" Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.041485 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzlw7\" (UniqueName: \"kubernetes.io/projected/8038baba-1420-4797-9752-5490c0940929-kube-api-access-gzlw7\") pod \"community-operators-gn69w\" (UID: \"8038baba-1420-4797-9752-5490c0940929\") " pod="openshift-marketplace/community-operators-gn69w" Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.041553 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8038baba-1420-4797-9752-5490c0940929-utilities\") pod \"community-operators-gn69w\" (UID: \"8038baba-1420-4797-9752-5490c0940929\") " pod="openshift-marketplace/community-operators-gn69w" Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.042166 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8038baba-1420-4797-9752-5490c0940929-catalog-content\") pod \"community-operators-gn69w\" (UID: \"8038baba-1420-4797-9752-5490c0940929\") " pod="openshift-marketplace/community-operators-gn69w" Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.042360 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8038baba-1420-4797-9752-5490c0940929-utilities\") pod \"community-operators-gn69w\" (UID: 
\"8038baba-1420-4797-9752-5490c0940929\") " pod="openshift-marketplace/community-operators-gn69w" Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.059029 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzlw7\" (UniqueName: \"kubernetes.io/projected/8038baba-1420-4797-9752-5490c0940929-kube-api-access-gzlw7\") pod \"community-operators-gn69w\" (UID: \"8038baba-1420-4797-9752-5490c0940929\") " pod="openshift-marketplace/community-operators-gn69w" Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.195984 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gn69w" Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.464922 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gn69w"] Mar 21 03:53:33 crc kubenswrapper[4685]: W0321 03:53:33.473520 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8038baba_1420_4797_9752_5490c0940929.slice/crio-cfd5f52d2edfc98a902a01c4072ec7559a3978c1d37eb7668d0cba9d2887b04b WatchSource:0}: Error finding container cfd5f52d2edfc98a902a01c4072ec7559a3978c1d37eb7668d0cba9d2887b04b: Status 404 returned error can't find the container with id cfd5f52d2edfc98a902a01c4072ec7559a3978c1d37eb7668d0cba9d2887b04b Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.673869 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7wpx4"] Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.674795 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7wpx4" Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.732819 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7wpx4"] Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.751361 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/df49233b-854a-4084-9336-28cf93963eb0-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7wpx4\" (UID: \"df49233b-854a-4084-9336-28cf93963eb0\") " pod="openshift-image-registry/image-registry-66df7c8f76-7wpx4" Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.751407 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h9fz\" (UniqueName: \"kubernetes.io/projected/df49233b-854a-4084-9336-28cf93963eb0-kube-api-access-6h9fz\") pod \"image-registry-66df7c8f76-7wpx4\" (UID: \"df49233b-854a-4084-9336-28cf93963eb0\") " pod="openshift-image-registry/image-registry-66df7c8f76-7wpx4" Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.751447 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/df49233b-854a-4084-9336-28cf93963eb0-registry-certificates\") pod \"image-registry-66df7c8f76-7wpx4\" (UID: \"df49233b-854a-4084-9336-28cf93963eb0\") " pod="openshift-image-registry/image-registry-66df7c8f76-7wpx4" Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.751469 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/df49233b-854a-4084-9336-28cf93963eb0-bound-sa-token\") pod \"image-registry-66df7c8f76-7wpx4\" (UID: \"df49233b-854a-4084-9336-28cf93963eb0\") " pod="openshift-image-registry/image-registry-66df7c8f76-7wpx4" Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.751486 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/df49233b-854a-4084-9336-28cf93963eb0-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7wpx4\" (UID: \"df49233b-854a-4084-9336-28cf93963eb0\") " pod="openshift-image-registry/image-registry-66df7c8f76-7wpx4" Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.751505 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df49233b-854a-4084-9336-28cf93963eb0-trusted-ca\") pod \"image-registry-66df7c8f76-7wpx4\" (UID: \"df49233b-854a-4084-9336-28cf93963eb0\") " pod="openshift-image-registry/image-registry-66df7c8f76-7wpx4" Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.751693 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/df49233b-854a-4084-9336-28cf93963eb0-registry-tls\") pod \"image-registry-66df7c8f76-7wpx4\" (UID: \"df49233b-854a-4084-9336-28cf93963eb0\") " pod="openshift-image-registry/image-registry-66df7c8f76-7wpx4" Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.751764 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7wpx4\" (UID: \"df49233b-854a-4084-9336-28cf93963eb0\") " pod="openshift-image-registry/image-registry-66df7c8f76-7wpx4" Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.791198 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7wpx4\" (UID: \"df49233b-854a-4084-9336-28cf93963eb0\") " pod="openshift-image-registry/image-registry-66df7c8f76-7wpx4" Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.852607 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/df49233b-854a-4084-9336-28cf93963eb0-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7wpx4\" (UID: \"df49233b-854a-4084-9336-28cf93963eb0\") " pod="openshift-image-registry/image-registry-66df7c8f76-7wpx4" Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.852657 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h9fz\" (UniqueName: \"kubernetes.io/projected/df49233b-854a-4084-9336-28cf93963eb0-kube-api-access-6h9fz\") pod \"image-registry-66df7c8f76-7wpx4\" (UID: \"df49233b-854a-4084-9336-28cf93963eb0\") " pod="openshift-image-registry/image-registry-66df7c8f76-7wpx4" Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.852709 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/df49233b-854a-4084-9336-28cf93963eb0-registry-certificates\") pod 
\"image-registry-66df7c8f76-7wpx4\" (UID: \"df49233b-854a-4084-9336-28cf93963eb0\") " pod="openshift-image-registry/image-registry-66df7c8f76-7wpx4" Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.852739 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/df49233b-854a-4084-9336-28cf93963eb0-bound-sa-token\") pod \"image-registry-66df7c8f76-7wpx4\" (UID: \"df49233b-854a-4084-9336-28cf93963eb0\") " pod="openshift-image-registry/image-registry-66df7c8f76-7wpx4" Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.852761 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/df49233b-854a-4084-9336-28cf93963eb0-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7wpx4\" (UID: \"df49233b-854a-4084-9336-28cf93963eb0\") " pod="openshift-image-registry/image-registry-66df7c8f76-7wpx4" Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.852789 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df49233b-854a-4084-9336-28cf93963eb0-trusted-ca\") pod \"image-registry-66df7c8f76-7wpx4\" (UID: \"df49233b-854a-4084-9336-28cf93963eb0\") " pod="openshift-image-registry/image-registry-66df7c8f76-7wpx4" Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.853927 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/df49233b-854a-4084-9336-28cf93963eb0-registry-tls\") pod \"image-registry-66df7c8f76-7wpx4\" (UID: \"df49233b-854a-4084-9336-28cf93963eb0\") " pod="openshift-image-registry/image-registry-66df7c8f76-7wpx4" Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.853545 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/df49233b-854a-4084-9336-28cf93963eb0-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7wpx4\" (UID: \"df49233b-854a-4084-9336-28cf93963eb0\") " pod="openshift-image-registry/image-registry-66df7c8f76-7wpx4" Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.854213 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/df49233b-854a-4084-9336-28cf93963eb0-registry-certificates\") pod \"image-registry-66df7c8f76-7wpx4\" (UID: \"df49233b-854a-4084-9336-28cf93963eb0\") " pod="openshift-image-registry/image-registry-66df7c8f76-7wpx4" Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.854286 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df49233b-854a-4084-9336-28cf93963eb0-trusted-ca\") pod \"image-registry-66df7c8f76-7wpx4\" (UID: \"df49233b-854a-4084-9336-28cf93963eb0\") " pod="openshift-image-registry/image-registry-66df7c8f76-7wpx4" Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.861623 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/df49233b-854a-4084-9336-28cf93963eb0-registry-tls\") pod \"image-registry-66df7c8f76-7wpx4\" (UID: \"df49233b-854a-4084-9336-28cf93963eb0\") " pod="openshift-image-registry/image-registry-66df7c8f76-7wpx4" Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.865720 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/df49233b-854a-4084-9336-28cf93963eb0-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7wpx4\" (UID: \"df49233b-854a-4084-9336-28cf93963eb0\") " pod="openshift-image-registry/image-registry-66df7c8f76-7wpx4" Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.867378 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/df49233b-854a-4084-9336-28cf93963eb0-bound-sa-token\") pod \"image-registry-66df7c8f76-7wpx4\" (UID: \"df49233b-854a-4084-9336-28cf93963eb0\") " pod="openshift-image-registry/image-registry-66df7c8f76-7wpx4" Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.868207 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h9fz\" (UniqueName: \"kubernetes.io/projected/df49233b-854a-4084-9336-28cf93963eb0-kube-api-access-6h9fz\") pod \"image-registry-66df7c8f76-7wpx4\" (UID: \"df49233b-854a-4084-9336-28cf93963eb0\") " pod="openshift-image-registry/image-registry-66df7c8f76-7wpx4" Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.901161 4685 generic.go:334] "Generic (PLEG): container finished" podID="8038baba-1420-4797-9752-5490c0940929" containerID="060766379f6feebc02c124ebf663a7f16678bd3ce5e737f8f20e461102dd0474" exitCode=0 Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.901215 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gn69w" event={"ID":"8038baba-1420-4797-9752-5490c0940929","Type":"ContainerDied","Data":"060766379f6feebc02c124ebf663a7f16678bd3ce5e737f8f20e461102dd0474"} Mar 21 03:53:33 crc kubenswrapper[4685]: I0321 03:53:33.901258 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gn69w" event={"ID":"8038baba-1420-4797-9752-5490c0940929","Type":"ContainerStarted","Data":"cfd5f52d2edfc98a902a01c4072ec7559a3978c1d37eb7668d0cba9d2887b04b"} Mar 21 03:53:34 crc kubenswrapper[4685]: I0321 03:53:34.035281 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7wpx4" Mar 21 03:53:34 crc kubenswrapper[4685]: I0321 03:53:34.278181 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h85m6"] Mar 21 03:53:34 crc kubenswrapper[4685]: I0321 03:53:34.279383 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h85m6" Mar 21 03:53:34 crc kubenswrapper[4685]: I0321 03:53:34.281600 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 21 03:53:34 crc kubenswrapper[4685]: I0321 03:53:34.283029 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h85m6"] Mar 21 03:53:34 crc kubenswrapper[4685]: I0321 03:53:34.360300 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43541153-b685-4759-bedf-261b2936431d-utilities\") pod \"certified-operators-h85m6\" (UID: \"43541153-b685-4759-bedf-261b2936431d\") " pod="openshift-marketplace/certified-operators-h85m6" Mar 21 03:53:34 crc kubenswrapper[4685]: I0321 03:53:34.360353 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9cf8\" (UniqueName: \"kubernetes.io/projected/43541153-b685-4759-bedf-261b2936431d-kube-api-access-b9cf8\") pod \"certified-operators-h85m6\" (UID: \"43541153-b685-4759-bedf-261b2936431d\") " pod="openshift-marketplace/certified-operators-h85m6" Mar 21 03:53:34 crc kubenswrapper[4685]: I0321 03:53:34.361133 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43541153-b685-4759-bedf-261b2936431d-catalog-content\") pod \"certified-operators-h85m6\" (UID: \"43541153-b685-4759-bedf-261b2936431d\") " pod="openshift-marketplace/certified-operators-h85m6" Mar 21 03:53:34 crc kubenswrapper[4685]: I0321 03:53:34.463124 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43541153-b685-4759-bedf-261b2936431d-catalog-content\") pod \"certified-operators-h85m6\" (UID: \"43541153-b685-4759-bedf-261b2936431d\") " pod="openshift-marketplace/certified-operators-h85m6" Mar 21 03:53:34 crc kubenswrapper[4685]: I0321 03:53:34.463218 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43541153-b685-4759-bedf-261b2936431d-utilities\") pod \"certified-operators-h85m6\" (UID: \"43541153-b685-4759-bedf-261b2936431d\") " pod="openshift-marketplace/certified-operators-h85m6" Mar 21 03:53:34 crc kubenswrapper[4685]: I0321 03:53:34.463248 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9cf8\" (UniqueName: \"kubernetes.io/projected/43541153-b685-4759-bedf-261b2936431d-kube-api-access-b9cf8\") pod \"certified-operators-h85m6\" (UID: \"43541153-b685-4759-bedf-261b2936431d\") " pod="openshift-marketplace/certified-operators-h85m6" Mar 21 03:53:34 crc kubenswrapper[4685]: I0321 03:53:34.464744 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43541153-b685-4759-bedf-261b2936431d-utilities\") pod \"certified-operators-h85m6\" (UID: \"43541153-b685-4759-bedf-261b2936431d\") " pod="openshift-marketplace/certified-operators-h85m6" Mar 21 03:53:34 crc kubenswrapper[4685]: I0321 03:53:34.465015 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43541153-b685-4759-bedf-261b2936431d-catalog-content\") pod \"certified-operators-h85m6\" (UID: 
\"43541153-b685-4759-bedf-261b2936431d\") " pod="openshift-marketplace/certified-operators-h85m6" Mar 21 03:53:34 crc kubenswrapper[4685]: I0321 03:53:34.467844 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7wpx4"] Mar 21 03:53:34 crc kubenswrapper[4685]: W0321 03:53:34.475289 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf49233b_854a_4084_9336_28cf93963eb0.slice/crio-3dfbe5f5b3271bc48123c82f91eaee3db785aca1e0a13670c3eea48b463225a4 WatchSource:0}: Error finding container 3dfbe5f5b3271bc48123c82f91eaee3db785aca1e0a13670c3eea48b463225a4: Status 404 returned error can't find the container with id 3dfbe5f5b3271bc48123c82f91eaee3db785aca1e0a13670c3eea48b463225a4 Mar 21 03:53:34 crc kubenswrapper[4685]: I0321 03:53:34.480930 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9cf8\" (UniqueName: \"kubernetes.io/projected/43541153-b685-4759-bedf-261b2936431d-kube-api-access-b9cf8\") pod \"certified-operators-h85m6\" (UID: \"43541153-b685-4759-bedf-261b2936431d\") " pod="openshift-marketplace/certified-operators-h85m6" Mar 21 03:53:34 crc kubenswrapper[4685]: I0321 03:53:34.601405 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h85m6" Mar 21 03:53:34 crc kubenswrapper[4685]: I0321 03:53:34.818760 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h85m6"] Mar 21 03:53:34 crc kubenswrapper[4685]: I0321 03:53:34.907173 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h85m6" event={"ID":"43541153-b685-4759-bedf-261b2936431d","Type":"ContainerStarted","Data":"ca55abf50828d0111a60657c22f0a55c73d56b22321d814b136ab803e91acca3"} Mar 21 03:53:34 crc kubenswrapper[4685]: I0321 03:53:34.909747 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7wpx4" event={"ID":"df49233b-854a-4084-9336-28cf93963eb0","Type":"ContainerStarted","Data":"966af1b602deee151f492b82fb2e8e055e1b1662f4ad08a868bbbe898e62c17d"} Mar 21 03:53:34 crc kubenswrapper[4685]: I0321 03:53:34.909769 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7wpx4" event={"ID":"df49233b-854a-4084-9336-28cf93963eb0","Type":"ContainerStarted","Data":"3dfbe5f5b3271bc48123c82f91eaee3db785aca1e0a13670c3eea48b463225a4"} Mar 21 03:53:34 crc kubenswrapper[4685]: I0321 03:53:34.910514 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-7wpx4" Mar 21 03:53:34 crc kubenswrapper[4685]: I0321 03:53:34.914392 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gn69w" event={"ID":"8038baba-1420-4797-9752-5490c0940929","Type":"ContainerStarted","Data":"bf8bd92f59b02035dbd37e6bd4be67684f7ecf751697e17fa514c2c8eb1a420c"} Mar 21 03:53:34 crc kubenswrapper[4685]: I0321 03:53:34.948045 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-7wpx4" podStartSLOduration=1.948029558 podStartE2EDuration="1.948029558s" podCreationTimestamp="2026-03-21 03:53:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 
03:53:34.930228162 +0000 UTC m=+447.407296954" watchObservedRunningTime="2026-03-21 03:53:34.948029558 +0000 UTC m=+447.425098350" Mar 21 03:53:35 crc kubenswrapper[4685]: I0321 03:53:35.278747 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bcc52"] Mar 21 03:53:35 crc kubenswrapper[4685]: I0321 03:53:35.280061 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bcc52" Mar 21 03:53:35 crc kubenswrapper[4685]: I0321 03:53:35.283217 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 21 03:53:35 crc kubenswrapper[4685]: I0321 03:53:35.290573 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bcc52"] Mar 21 03:53:35 crc kubenswrapper[4685]: I0321 03:53:35.372940 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwrcn\" (UniqueName: \"kubernetes.io/projected/c0edc692-b945-418e-8d2e-129f9c88644e-kube-api-access-gwrcn\") pod \"redhat-marketplace-bcc52\" (UID: \"c0edc692-b945-418e-8d2e-129f9c88644e\") " pod="openshift-marketplace/redhat-marketplace-bcc52" Mar 21 03:53:35 crc kubenswrapper[4685]: I0321 03:53:35.373175 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0edc692-b945-418e-8d2e-129f9c88644e-catalog-content\") pod \"redhat-marketplace-bcc52\" (UID: \"c0edc692-b945-418e-8d2e-129f9c88644e\") " pod="openshift-marketplace/redhat-marketplace-bcc52" Mar 21 03:53:35 crc kubenswrapper[4685]: I0321 03:53:35.373315 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0edc692-b945-418e-8d2e-129f9c88644e-utilities\") pod \"redhat-marketplace-bcc52\" (UID: \"c0edc692-b945-418e-8d2e-129f9c88644e\") " pod="openshift-marketplace/redhat-marketplace-bcc52" Mar 21 03:53:35 crc kubenswrapper[4685]: I0321 03:53:35.474560 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0edc692-b945-418e-8d2e-129f9c88644e-utilities\") pod \"redhat-marketplace-bcc52\" (UID: \"c0edc692-b945-418e-8d2e-129f9c88644e\") " pod="openshift-marketplace/redhat-marketplace-bcc52" Mar 21 03:53:35 crc kubenswrapper[4685]: I0321 03:53:35.474625 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwrcn\" (UniqueName: \"kubernetes.io/projected/c0edc692-b945-418e-8d2e-129f9c88644e-kube-api-access-gwrcn\") pod \"redhat-marketplace-bcc52\" (UID: \"c0edc692-b945-418e-8d2e-129f9c88644e\") " pod="openshift-marketplace/redhat-marketplace-bcc52" Mar 21 03:53:35 crc kubenswrapper[4685]: I0321 03:53:35.474672 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0edc692-b945-418e-8d2e-129f9c88644e-catalog-content\") pod \"redhat-marketplace-bcc52\" (UID: \"c0edc692-b945-418e-8d2e-129f9c88644e\") " pod="openshift-marketplace/redhat-marketplace-bcc52" Mar 21 03:53:35 crc kubenswrapper[4685]: I0321 03:53:35.475172 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0edc692-b945-418e-8d2e-129f9c88644e-catalog-content\") pod 
\"redhat-marketplace-bcc52\" (UID: \"c0edc692-b945-418e-8d2e-129f9c88644e\") " pod="openshift-marketplace/redhat-marketplace-bcc52" Mar 21 03:53:35 crc kubenswrapper[4685]: I0321 03:53:35.475553 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0edc692-b945-418e-8d2e-129f9c88644e-utilities\") pod \"redhat-marketplace-bcc52\" (UID: \"c0edc692-b945-418e-8d2e-129f9c88644e\") " pod="openshift-marketplace/redhat-marketplace-bcc52" Mar 21 03:53:35 crc kubenswrapper[4685]: I0321 03:53:35.505843 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwrcn\" (UniqueName: \"kubernetes.io/projected/c0edc692-b945-418e-8d2e-129f9c88644e-kube-api-access-gwrcn\") pod \"redhat-marketplace-bcc52\" (UID: \"c0edc692-b945-418e-8d2e-129f9c88644e\") " pod="openshift-marketplace/redhat-marketplace-bcc52" Mar 21 03:53:35 crc kubenswrapper[4685]: I0321 03:53:35.613171 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bcc52" Mar 21 03:53:35 crc kubenswrapper[4685]: I0321 03:53:35.920618 4685 generic.go:334] "Generic (PLEG): container finished" podID="8038baba-1420-4797-9752-5490c0940929" containerID="bf8bd92f59b02035dbd37e6bd4be67684f7ecf751697e17fa514c2c8eb1a420c" exitCode=0 Mar 21 03:53:35 crc kubenswrapper[4685]: I0321 03:53:35.920868 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gn69w" event={"ID":"8038baba-1420-4797-9752-5490c0940929","Type":"ContainerDied","Data":"bf8bd92f59b02035dbd37e6bd4be67684f7ecf751697e17fa514c2c8eb1a420c"} Mar 21 03:53:35 crc kubenswrapper[4685]: I0321 03:53:35.922076 4685 generic.go:334] "Generic (PLEG): container finished" podID="43541153-b685-4759-bedf-261b2936431d" containerID="4704ec86fc0dcf1fe44edd8ba79b5a56b28bd84d98eca97d95152252924d669f" exitCode=0 Mar 21 03:53:35 crc kubenswrapper[4685]: I0321 03:53:35.922732 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h85m6" event={"ID":"43541153-b685-4759-bedf-261b2936431d","Type":"ContainerDied","Data":"4704ec86fc0dcf1fe44edd8ba79b5a56b28bd84d98eca97d95152252924d669f"} Mar 21 03:53:36 crc kubenswrapper[4685]: I0321 03:53:36.072647 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bcc52"] Mar 21 03:53:36 crc kubenswrapper[4685]: I0321 03:53:36.929236 4685 generic.go:334] "Generic (PLEG): container finished" podID="c0edc692-b945-418e-8d2e-129f9c88644e" containerID="9fc99856ef706f32159e873cc9eab306f2e5673ccf0cf674ea45490d732377f8" exitCode=0 Mar 21 03:53:36 crc kubenswrapper[4685]: I0321 03:53:36.929926 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bcc52" event={"ID":"c0edc692-b945-418e-8d2e-129f9c88644e","Type":"ContainerDied","Data":"9fc99856ef706f32159e873cc9eab306f2e5673ccf0cf674ea45490d732377f8"} Mar 21 03:53:36 crc kubenswrapper[4685]: I0321 03:53:36.930103 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bcc52" event={"ID":"c0edc692-b945-418e-8d2e-129f9c88644e","Type":"ContainerStarted","Data":"7c0c46609a13398a887efb976900297d26ddb8c470aba7d337e043e640b704e9"} Mar 21 03:53:36 crc kubenswrapper[4685]: I0321 03:53:36.933386 4685 generic.go:334] "Generic (PLEG): container finished" podID="43541153-b685-4759-bedf-261b2936431d" 
containerID="18a38dc736b661edcfd0e7aa43027b0d0bc32984fb0c38be0af4f76a0ad2aa72" exitCode=0 Mar 21 03:53:36 crc kubenswrapper[4685]: I0321 03:53:36.933513 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h85m6" event={"ID":"43541153-b685-4759-bedf-261b2936431d","Type":"ContainerDied","Data":"18a38dc736b661edcfd0e7aa43027b0d0bc32984fb0c38be0af4f76a0ad2aa72"} Mar 21 03:53:36 crc kubenswrapper[4685]: I0321 03:53:36.936264 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gn69w" event={"ID":"8038baba-1420-4797-9752-5490c0940929","Type":"ContainerStarted","Data":"ccc58494e91c8ce72a7ed7746145264b20acf1d402e2bcaff5dc7e9ca01c348f"} Mar 21 03:53:36 crc kubenswrapper[4685]: I0321 03:53:36.997892 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gn69w" podStartSLOduration=2.485047463 podStartE2EDuration="4.997865824s" podCreationTimestamp="2026-03-21 03:53:32 +0000 UTC" firstStartedPulling="2026-03-21 03:53:33.902516836 +0000 UTC m=+446.379585628" lastFinishedPulling="2026-03-21 03:53:36.415335197 +0000 UTC m=+448.892403989" observedRunningTime="2026-03-21 03:53:36.990476422 +0000 UTC m=+449.467545214" watchObservedRunningTime="2026-03-21 03:53:36.997865824 +0000 UTC m=+449.474934636" Mar 21 03:53:37 crc kubenswrapper[4685]: I0321 03:53:37.073111 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ttxzc"] Mar 21 03:53:37 crc kubenswrapper[4685]: I0321 03:53:37.074546 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ttxzc" Mar 21 03:53:37 crc kubenswrapper[4685]: I0321 03:53:37.076274 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 21 03:53:37 crc kubenswrapper[4685]: I0321 03:53:37.085539 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ttxzc"] Mar 21 03:53:37 crc kubenswrapper[4685]: I0321 03:53:37.216052 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5118e92f-b64a-4a3b-b9e7-3902c745dbdd-catalog-content\") pod \"redhat-operators-ttxzc\" (UID: \"5118e92f-b64a-4a3b-b9e7-3902c745dbdd\") " pod="openshift-marketplace/redhat-operators-ttxzc" Mar 21 03:53:37 crc kubenswrapper[4685]: I0321 03:53:37.216127 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpqf5\" (UniqueName: \"kubernetes.io/projected/5118e92f-b64a-4a3b-b9e7-3902c745dbdd-kube-api-access-kpqf5\") pod \"redhat-operators-ttxzc\" (UID: \"5118e92f-b64a-4a3b-b9e7-3902c745dbdd\") " pod="openshift-marketplace/redhat-operators-ttxzc" Mar 21 03:53:37 crc kubenswrapper[4685]: I0321 03:53:37.216208 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5118e92f-b64a-4a3b-b9e7-3902c745dbdd-utilities\") pod \"redhat-operators-ttxzc\" (UID: \"5118e92f-b64a-4a3b-b9e7-3902c745dbdd\") " pod="openshift-marketplace/redhat-operators-ttxzc" Mar 21 03:53:37 crc kubenswrapper[4685]: I0321 03:53:37.317593 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5118e92f-b64a-4a3b-b9e7-3902c745dbdd-utilities\") pod \"redhat-operators-ttxzc\" (UID: \"5118e92f-b64a-4a3b-b9e7-3902c745dbdd\") " pod="openshift-marketplace/redhat-operators-ttxzc" Mar 21 03:53:37 crc kubenswrapper[4685]: I0321 03:53:37.317789 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5118e92f-b64a-4a3b-b9e7-3902c745dbdd-catalog-content\") pod \"redhat-operators-ttxzc\" (UID: \"5118e92f-b64a-4a3b-b9e7-3902c745dbdd\") " pod="openshift-marketplace/redhat-operators-ttxzc" Mar 21 03:53:37 crc kubenswrapper[4685]: I0321 03:53:37.317834 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpqf5\" (UniqueName: \"kubernetes.io/projected/5118e92f-b64a-4a3b-b9e7-3902c745dbdd-kube-api-access-kpqf5\") pod \"redhat-operators-ttxzc\" (UID: \"5118e92f-b64a-4a3b-b9e7-3902c745dbdd\") " pod="openshift-marketplace/redhat-operators-ttxzc" Mar 21 03:53:37 crc kubenswrapper[4685]: I0321 03:53:37.318287 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5118e92f-b64a-4a3b-b9e7-3902c745dbdd-catalog-content\") pod \"redhat-operators-ttxzc\" (UID: \"5118e92f-b64a-4a3b-b9e7-3902c745dbdd\") " pod="openshift-marketplace/redhat-operators-ttxzc" Mar 21 03:53:37 crc kubenswrapper[4685]: I0321 03:53:37.318737 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5118e92f-b64a-4a3b-b9e7-3902c745dbdd-utilities\") pod \"redhat-operators-ttxzc\" (UID: \"5118e92f-b64a-4a3b-b9e7-3902c745dbdd\") " pod="openshift-marketplace/redhat-operators-ttxzc" Mar 21 03:53:37 crc kubenswrapper[4685]: I0321 03:53:37.341310 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpqf5\" (UniqueName: \"kubernetes.io/projected/5118e92f-b64a-4a3b-b9e7-3902c745dbdd-kube-api-access-kpqf5\") pod \"redhat-operators-ttxzc\" (UID: \"5118e92f-b64a-4a3b-b9e7-3902c745dbdd\") " pod="openshift-marketplace/redhat-operators-ttxzc" Mar 21 03:53:37 crc kubenswrapper[4685]: I0321 03:53:37.430249 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ttxzc" Mar 21 03:53:37 crc kubenswrapper[4685]: I0321 03:53:37.836561 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ttxzc"] Mar 21 03:53:37 crc kubenswrapper[4685]: W0321 03:53:37.844643 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5118e92f_b64a_4a3b_b9e7_3902c745dbdd.slice/crio-99fa4f54b927e2865dd4115a02ebbf573c0ebd83bfab8441222adaa97367caa7 WatchSource:0}: Error finding container 99fa4f54b927e2865dd4115a02ebbf573c0ebd83bfab8441222adaa97367caa7: Status 404 returned error can't find the container with id 99fa4f54b927e2865dd4115a02ebbf573c0ebd83bfab8441222adaa97367caa7 Mar 21 03:53:37 crc kubenswrapper[4685]: I0321 03:53:37.943424 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h85m6" event={"ID":"43541153-b685-4759-bedf-261b2936431d","Type":"ContainerStarted","Data":"ed632e12cda9ffe99de2d0d2f5935876dc77b78555a109087d6f89e9cf4cdce5"} Mar 21 03:53:37 crc kubenswrapper[4685]: I0321 03:53:37.945026 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttxzc" event={"ID":"5118e92f-b64a-4a3b-b9e7-3902c745dbdd","Type":"ContainerStarted","Data":"99fa4f54b927e2865dd4115a02ebbf573c0ebd83bfab8441222adaa97367caa7"} Mar 21 03:53:37 crc kubenswrapper[4685]: I0321 03:53:37.947092 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bcc52" event={"ID":"c0edc692-b945-418e-8d2e-129f9c88644e","Type":"ContainerStarted","Data":"f4b12fa298fcf51396546c90bc7deec7e74c4a04c064ebd4d0f809c6a0d13b8c"} Mar 21 03:53:37 crc kubenswrapper[4685]: I0321 03:53:37.970130 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-h85m6" podStartSLOduration=2.532546227 podStartE2EDuration="3.970110369s" podCreationTimestamp="2026-03-21 03:53:34 +0000 UTC" firstStartedPulling="2026-03-21 03:53:35.923222664 +0000 UTC m=+448.400291456" lastFinishedPulling="2026-03-21 03:53:37.360786806 +0000 UTC m=+449.837855598" observedRunningTime="2026-03-21 03:53:37.96581699 +0000 UTC m=+450.442885782" watchObservedRunningTime="2026-03-21 03:53:37.970110369 +0000 UTC m=+450.447179161" Mar 21 03:53:38 crc kubenswrapper[4685]: I0321 03:53:38.953239 4685 generic.go:334] "Generic (PLEG): container finished" podID="5118e92f-b64a-4a3b-b9e7-3902c745dbdd" containerID="0604c77de6871f2c7163fd8b2639d7c7bc96978ee6841d5ddb3b9709fb33db6d" exitCode=0 Mar 21 03:53:38 crc kubenswrapper[4685]: I0321 03:53:38.953299 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttxzc" event={"ID":"5118e92f-b64a-4a3b-b9e7-3902c745dbdd","Type":"ContainerDied","Data":"0604c77de6871f2c7163fd8b2639d7c7bc96978ee6841d5ddb3b9709fb33db6d"} Mar 21 03:53:38 crc kubenswrapper[4685]: I0321 03:53:38.955860 4685 generic.go:334] "Generic (PLEG): container finished" podID="c0edc692-b945-418e-8d2e-129f9c88644e" containerID="f4b12fa298fcf51396546c90bc7deec7e74c4a04c064ebd4d0f809c6a0d13b8c" exitCode=0 Mar 21 03:53:38 crc kubenswrapper[4685]: I0321 03:53:38.956576 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bcc52" event={"ID":"c0edc692-b945-418e-8d2e-129f9c88644e","Type":"ContainerDied","Data":"f4b12fa298fcf51396546c90bc7deec7e74c4a04c064ebd4d0f809c6a0d13b8c"} Mar 21 03:53:39 crc 
kubenswrapper[4685]: I0321 03:53:39.685785 4685 patch_prober.go:28] interesting pod/machine-config-daemon-7r9cg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 03:53:39 crc kubenswrapper[4685]: I0321 03:53:39.686095 4685 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 03:53:39 crc kubenswrapper[4685]: I0321 03:53:39.963087 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bcc52" event={"ID":"c0edc692-b945-418e-8d2e-129f9c88644e","Type":"ContainerStarted","Data":"a50d1aa6bd144043dc71d4774d51c48b9ab58b0e4e6dceb9209015840ff25882"} Mar 21 03:53:39 crc kubenswrapper[4685]: I0321 03:53:39.964893 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttxzc" event={"ID":"5118e92f-b64a-4a3b-b9e7-3902c745dbdd","Type":"ContainerStarted","Data":"f07ebe729735da99a15d5a894f2585bc0550f54e5f99a187678d531a0d186fda"} Mar 21 03:53:39 crc kubenswrapper[4685]: I0321 03:53:39.982694 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bcc52" podStartSLOduration=2.543918722 podStartE2EDuration="4.982672892s" podCreationTimestamp="2026-03-21 03:53:35 +0000 UTC" firstStartedPulling="2026-03-21 03:53:36.930577647 +0000 UTC m=+449.407646439" lastFinishedPulling="2026-03-21 03:53:39.369331807 +0000 UTC m=+451.846400609" observedRunningTime="2026-03-21 03:53:39.980460316 +0000 UTC m=+452.457529108" watchObservedRunningTime="2026-03-21 03:53:39.982672892 +0000 UTC m=+452.459741684" Mar 21 03:53:40 crc kubenswrapper[4685]: I0321 03:53:40.981531 4685 generic.go:334] "Generic (PLEG): container finished" podID="5118e92f-b64a-4a3b-b9e7-3902c745dbdd" containerID="f07ebe729735da99a15d5a894f2585bc0550f54e5f99a187678d531a0d186fda" exitCode=0 Mar 21 03:53:40 crc kubenswrapper[4685]: I0321 03:53:40.981763 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttxzc" event={"ID":"5118e92f-b64a-4a3b-b9e7-3902c745dbdd","Type":"ContainerDied","Data":"f07ebe729735da99a15d5a894f2585bc0550f54e5f99a187678d531a0d186fda"} Mar 21 03:53:41 crc kubenswrapper[4685]: I0321 03:53:41.992496 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttxzc" event={"ID":"5118e92f-b64a-4a3b-b9e7-3902c745dbdd","Type":"ContainerStarted","Data":"7c412ab97acf9b62e3249cecd30c1d5278c6d9453b6ae833fb22ae75e5a2bf3c"} Mar 21 03:53:42 crc kubenswrapper[4685]: I0321 03:53:42.010449 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ttxzc" podStartSLOduration=2.557724438 podStartE2EDuration="5.010425581s" podCreationTimestamp="2026-03-21 03:53:37 +0000 UTC" firstStartedPulling="2026-03-21 03:53:38.954701818 +0000 UTC m=+451.431770600" lastFinishedPulling="2026-03-21 03:53:41.407402951 +0000 UTC m=+453.884471743" observedRunningTime="2026-03-21 03:53:42.009594996 +0000 UTC m=+454.486663808" watchObservedRunningTime="2026-03-21 03:53:42.010425581 +0000 UTC m=+454.487494373" Mar 21 03:53:43 crc 
kubenswrapper[4685]: I0321 03:53:43.196943 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gn69w" Mar 21 03:53:43 crc kubenswrapper[4685]: I0321 03:53:43.197121 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gn69w" Mar 21 03:53:43 crc kubenswrapper[4685]: I0321 03:53:43.248967 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gn69w" Mar 21 03:53:44 crc kubenswrapper[4685]: I0321 03:53:44.056289 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gn69w" Mar 21 03:53:44 crc kubenswrapper[4685]: I0321 03:53:44.601924 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-h85m6" Mar 21 03:53:44 crc kubenswrapper[4685]: I0321 03:53:44.601987 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h85m6" Mar 21 03:53:44 crc kubenswrapper[4685]: I0321 03:53:44.654034 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h85m6" Mar 21 03:53:45 crc kubenswrapper[4685]: I0321 03:53:45.044421 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h85m6" Mar 21 03:53:45 crc kubenswrapper[4685]: I0321 03:53:45.613607 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bcc52" Mar 21 03:53:45 crc kubenswrapper[4685]: I0321 03:53:45.613652 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bcc52" Mar 21 03:53:45 crc kubenswrapper[4685]: I0321 03:53:45.668749 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bcc52" Mar 21 03:53:46 crc kubenswrapper[4685]: I0321 03:53:46.057480 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bcc52" Mar 21 03:53:47 crc kubenswrapper[4685]: I0321 03:53:47.431337 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ttxzc" Mar 21 03:53:47 crc kubenswrapper[4685]: I0321 03:53:47.431781 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ttxzc" Mar 21 03:53:48 crc kubenswrapper[4685]: I0321 03:53:48.474076 4685 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ttxzc" podUID="5118e92f-b64a-4a3b-b9e7-3902c745dbdd" containerName="registry-server" probeResult="failure" output=< Mar 21 03:53:48 crc kubenswrapper[4685]: timeout: failed to connect service ":50051" within 1s Mar 21 03:53:48 crc kubenswrapper[4685]: > Mar 21 03:53:54 crc kubenswrapper[4685]: I0321 03:53:54.053444 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-7wpx4" Mar 21 03:53:54 crc kubenswrapper[4685]: I0321 03:53:54.120042 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-pjvx8"] Mar 21 03:53:57 crc kubenswrapper[4685]: I0321 03:53:57.501119 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-operators-ttxzc" Mar 21 03:53:57 crc kubenswrapper[4685]: I0321 03:53:57.574026 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ttxzc" Mar 21 03:54:00 crc kubenswrapper[4685]: I0321 03:54:00.144076 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567754-5w6w5"] Mar 21 03:54:00 crc kubenswrapper[4685]: I0321 03:54:00.145293 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567754-5w6w5" Mar 21 03:54:00 crc kubenswrapper[4685]: I0321 03:54:00.147666 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567754-5w6w5"] Mar 21 03:54:00 crc kubenswrapper[4685]: I0321 03:54:00.148949 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 03:54:00 crc kubenswrapper[4685]: I0321 03:54:00.148950 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 03:54:00 crc kubenswrapper[4685]: I0321 03:54:00.149923 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k75cc" Mar 21 03:54:00 crc kubenswrapper[4685]: I0321 03:54:00.325133 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8frk\" (UniqueName: \"kubernetes.io/projected/e1ef6e0e-74c2-4e2b-bcfa-70d821d09201-kube-api-access-w8frk\") pod \"auto-csr-approver-29567754-5w6w5\" (UID: \"e1ef6e0e-74c2-4e2b-bcfa-70d821d09201\") " pod="openshift-infra/auto-csr-approver-29567754-5w6w5" Mar 21 03:54:00 crc kubenswrapper[4685]: I0321 03:54:00.426798 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8frk\" (UniqueName: \"kubernetes.io/projected/e1ef6e0e-74c2-4e2b-bcfa-70d821d09201-kube-api-access-w8frk\") pod \"auto-csr-approver-29567754-5w6w5\" (UID: \"e1ef6e0e-74c2-4e2b-bcfa-70d821d09201\") " pod="openshift-infra/auto-csr-approver-29567754-5w6w5" Mar 21 03:54:00 crc kubenswrapper[4685]: I0321 03:54:00.448459 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8frk\" (UniqueName: \"kubernetes.io/projected/e1ef6e0e-74c2-4e2b-bcfa-70d821d09201-kube-api-access-w8frk\") pod \"auto-csr-approver-29567754-5w6w5\" (UID: \"e1ef6e0e-74c2-4e2b-bcfa-70d821d09201\") " pod="openshift-infra/auto-csr-approver-29567754-5w6w5" Mar 21 03:54:00 crc kubenswrapper[4685]: I0321 03:54:00.465682 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567754-5w6w5" Mar 21 03:54:00 crc kubenswrapper[4685]: I0321 03:54:00.883480 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567754-5w6w5"] Mar 21 03:54:01 crc kubenswrapper[4685]: I0321 03:54:01.111667 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567754-5w6w5" event={"ID":"e1ef6e0e-74c2-4e2b-bcfa-70d821d09201","Type":"ContainerStarted","Data":"9e70f1aa2019b3abdecd6e78957a6b003b0b0ffc431e8b29913f0bb24392b227"} Mar 21 03:54:04 crc kubenswrapper[4685]: I0321 03:54:04.128622 4685 generic.go:334] "Generic (PLEG): container finished" podID="e1ef6e0e-74c2-4e2b-bcfa-70d821d09201" containerID="414be56549cd6b11efa98ee004718f676f58a724a219aea231fbba058e444aaa" exitCode=0 Mar 21 03:54:04 crc kubenswrapper[4685]: I0321 03:54:04.128709 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567754-5w6w5" event={"ID":"e1ef6e0e-74c2-4e2b-bcfa-70d821d09201","Type":"ContainerDied","Data":"414be56549cd6b11efa98ee004718f676f58a724a219aea231fbba058e444aaa"} Mar 21 03:54:05 crc kubenswrapper[4685]: I0321 03:54:05.370374 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567754-5w6w5" Mar 21 03:54:05 crc kubenswrapper[4685]: I0321 03:54:05.487607 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8frk\" (UniqueName: \"kubernetes.io/projected/e1ef6e0e-74c2-4e2b-bcfa-70d821d09201-kube-api-access-w8frk\") pod \"e1ef6e0e-74c2-4e2b-bcfa-70d821d09201\" (UID: \"e1ef6e0e-74c2-4e2b-bcfa-70d821d09201\") " Mar 21 03:54:05 crc kubenswrapper[4685]: I0321 03:54:05.493640 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1ef6e0e-74c2-4e2b-bcfa-70d821d09201-kube-api-access-w8frk" (OuterVolumeSpecName: "kube-api-access-w8frk") pod "e1ef6e0e-74c2-4e2b-bcfa-70d821d09201" (UID: "e1ef6e0e-74c2-4e2b-bcfa-70d821d09201"). InnerVolumeSpecName "kube-api-access-w8frk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:54:05 crc kubenswrapper[4685]: I0321 03:54:05.589418 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8frk\" (UniqueName: \"kubernetes.io/projected/e1ef6e0e-74c2-4e2b-bcfa-70d821d09201-kube-api-access-w8frk\") on node \"crc\" DevicePath \"\"" Mar 21 03:54:06 crc kubenswrapper[4685]: I0321 03:54:06.140646 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567754-5w6w5" event={"ID":"e1ef6e0e-74c2-4e2b-bcfa-70d821d09201","Type":"ContainerDied","Data":"9e70f1aa2019b3abdecd6e78957a6b003b0b0ffc431e8b29913f0bb24392b227"} Mar 21 03:54:06 crc kubenswrapper[4685]: I0321 03:54:06.140999 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e70f1aa2019b3abdecd6e78957a6b003b0b0ffc431e8b29913f0bb24392b227" Mar 21 03:54:06 crc kubenswrapper[4685]: I0321 03:54:06.141077 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567754-5w6w5" Mar 21 03:54:06 crc kubenswrapper[4685]: I0321 03:54:06.433217 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567748-zv7h8"] Mar 21 03:54:06 crc kubenswrapper[4685]: I0321 03:54:06.438279 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567748-zv7h8"] Mar 21 03:54:08 crc kubenswrapper[4685]: I0321 03:54:08.313497 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ede3f08-f29b-4cb9-a96f-1c66239498f6" path="/var/lib/kubelet/pods/4ede3f08-f29b-4cb9-a96f-1c66239498f6/volumes" Mar 21 03:54:09 crc kubenswrapper[4685]: I0321 03:54:09.685365 4685 patch_prober.go:28] interesting pod/machine-config-daemon-7r9cg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 03:54:09 crc kubenswrapper[4685]: I0321 03:54:09.685430 4685 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 03:54:09 crc kubenswrapper[4685]: I0321 03:54:09.685478 4685 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" Mar 21 03:54:09 crc kubenswrapper[4685]: I0321 03:54:09.686079 4685 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8ae2ea0f2d37402c06f62c37b02e2377743ae0dc80e5e3ec752094ab6ef40392"} pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 03:54:09 crc kubenswrapper[4685]: I0321 03:54:09.686151 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerName="machine-config-daemon" containerID="cri-o://8ae2ea0f2d37402c06f62c37b02e2377743ae0dc80e5e3ec752094ab6ef40392" gracePeriod=600 Mar 21 03:54:10 crc kubenswrapper[4685]: I0321 03:54:10.166650 4685 generic.go:334] "Generic (PLEG): container finished" podID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerID="8ae2ea0f2d37402c06f62c37b02e2377743ae0dc80e5e3ec752094ab6ef40392" exitCode=0 Mar 21 03:54:10 crc kubenswrapper[4685]: I0321 03:54:10.166728 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" event={"ID":"cea46fe2-4e41-43ab-a069-cb30fb4e732c","Type":"ContainerDied","Data":"8ae2ea0f2d37402c06f62c37b02e2377743ae0dc80e5e3ec752094ab6ef40392"} Mar 21 03:54:10 crc kubenswrapper[4685]: I0321 03:54:10.167178 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" event={"ID":"cea46fe2-4e41-43ab-a069-cb30fb4e732c","Type":"ContainerStarted","Data":"51700df58050c3bd486b7492e271833d0dee5610ed2bdc61e612672321528c6c"} Mar 21 03:54:10 crc kubenswrapper[4685]: I0321 03:54:10.167202 4685 scope.go:117] "RemoveContainer" 
containerID="682dda84970818843427fd441cf0359877cff12a6a10a7f7fad13d60c5ca62f3" Mar 21 03:54:19 crc kubenswrapper[4685]: I0321 03:54:19.167445 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" podUID="5a823511-d878-4e6d-acda-4202e00e3aab" containerName="registry" containerID="cri-o://b518ce61b84c42bf82a7212f4d9e88739f081f111fb39d9a384246703239366b" gracePeriod=30 Mar 21 03:54:20 crc kubenswrapper[4685]: I0321 03:54:19.592559 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" Mar 21 03:54:20 crc kubenswrapper[4685]: I0321 03:54:19.752065 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqdgc\" (UniqueName: \"kubernetes.io/projected/5a823511-d878-4e6d-acda-4202e00e3aab-kube-api-access-tqdgc\") pod \"5a823511-d878-4e6d-acda-4202e00e3aab\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " Mar 21 03:54:20 crc kubenswrapper[4685]: I0321 03:54:19.752112 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5a823511-d878-4e6d-acda-4202e00e3aab-installation-pull-secrets\") pod \"5a823511-d878-4e6d-acda-4202e00e3aab\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " Mar 21 03:54:20 crc kubenswrapper[4685]: I0321 03:54:19.752161 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5a823511-d878-4e6d-acda-4202e00e3aab-trusted-ca\") pod \"5a823511-d878-4e6d-acda-4202e00e3aab\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " Mar 21 03:54:20 crc kubenswrapper[4685]: I0321 03:54:19.752325 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"5a823511-d878-4e6d-acda-4202e00e3aab\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " Mar 21 03:54:20 crc kubenswrapper[4685]: I0321 03:54:19.752357 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5a823511-d878-4e6d-acda-4202e00e3aab-registry-tls\") pod \"5a823511-d878-4e6d-acda-4202e00e3aab\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " Mar 21 03:54:20 crc kubenswrapper[4685]: I0321 03:54:19.752389 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5a823511-d878-4e6d-acda-4202e00e3aab-bound-sa-token\") pod \"5a823511-d878-4e6d-acda-4202e00e3aab\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " Mar 21 03:54:20 crc kubenswrapper[4685]: I0321 03:54:19.752408 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5a823511-d878-4e6d-acda-4202e00e3aab-ca-trust-extracted\") pod \"5a823511-d878-4e6d-acda-4202e00e3aab\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " Mar 21 03:54:20 crc kubenswrapper[4685]: I0321 03:54:19.752437 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5a823511-d878-4e6d-acda-4202e00e3aab-registry-certificates\") pod \"5a823511-d878-4e6d-acda-4202e00e3aab\" (UID: \"5a823511-d878-4e6d-acda-4202e00e3aab\") " 
Mar 21 03:54:20 crc kubenswrapper[4685]: I0321 03:54:19.753373 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a823511-d878-4e6d-acda-4202e00e3aab-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "5a823511-d878-4e6d-acda-4202e00e3aab" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 03:54:20 crc kubenswrapper[4685]: I0321 03:54:19.754576 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a823511-d878-4e6d-acda-4202e00e3aab-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "5a823511-d878-4e6d-acda-4202e00e3aab" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 03:54:20 crc kubenswrapper[4685]: I0321 03:54:19.759871 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a823511-d878-4e6d-acda-4202e00e3aab-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "5a823511-d878-4e6d-acda-4202e00e3aab" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 03:54:20 crc kubenswrapper[4685]: I0321 03:54:19.768990 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a823511-d878-4e6d-acda-4202e00e3aab-kube-api-access-tqdgc" (OuterVolumeSpecName: "kube-api-access-tqdgc") pod "5a823511-d878-4e6d-acda-4202e00e3aab" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab"). InnerVolumeSpecName "kube-api-access-tqdgc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 03:54:20 crc kubenswrapper[4685]: I0321 03:54:19.769320 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a823511-d878-4e6d-acda-4202e00e3aab-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "5a823511-d878-4e6d-acda-4202e00e3aab" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 03:54:20 crc kubenswrapper[4685]: I0321 03:54:19.769683 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a823511-d878-4e6d-acda-4202e00e3aab-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "5a823511-d878-4e6d-acda-4202e00e3aab" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 03:54:20 crc kubenswrapper[4685]: I0321 03:54:19.780224 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "5a823511-d878-4e6d-acda-4202e00e3aab" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 21 03:54:20 crc kubenswrapper[4685]: I0321 03:54:19.782736 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a823511-d878-4e6d-acda-4202e00e3aab-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "5a823511-d878-4e6d-acda-4202e00e3aab" (UID: "5a823511-d878-4e6d-acda-4202e00e3aab"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 03:54:20 crc kubenswrapper[4685]: I0321 03:54:19.854073 4685 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5a823511-d878-4e6d-acda-4202e00e3aab-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 21 03:54:20 crc kubenswrapper[4685]: I0321 03:54:19.854104 4685 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5a823511-d878-4e6d-acda-4202e00e3aab-registry-tls\") on node \"crc\" DevicePath \"\""
Mar 21 03:54:20 crc kubenswrapper[4685]: I0321 03:54:19.854114 4685 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5a823511-d878-4e6d-acda-4202e00e3aab-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 21 03:54:20 crc kubenswrapper[4685]: I0321 03:54:19.854122 4685 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5a823511-d878-4e6d-acda-4202e00e3aab-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Mar 21 03:54:20 crc kubenswrapper[4685]: I0321 03:54:19.854131 4685 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5a823511-d878-4e6d-acda-4202e00e3aab-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 21 03:54:20 crc kubenswrapper[4685]: I0321 03:54:19.854140 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqdgc\" (UniqueName: \"kubernetes.io/projected/5a823511-d878-4e6d-acda-4202e00e3aab-kube-api-access-tqdgc\") on node \"crc\" DevicePath \"\""
Mar 21 03:54:20 crc kubenswrapper[4685]: I0321 03:54:19.854148 4685 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5a823511-d878-4e6d-acda-4202e00e3aab-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 21 03:54:20 crc kubenswrapper[4685]: I0321 03:54:20.230477 4685 generic.go:334] "Generic (PLEG): container finished" podID="5a823511-d878-4e6d-acda-4202e00e3aab" containerID="b518ce61b84c42bf82a7212f4d9e88739f081f111fb39d9a384246703239366b" exitCode=0
Mar 21 03:54:20 crc kubenswrapper[4685]: I0321 03:54:20.230518 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" event={"ID":"5a823511-d878-4e6d-acda-4202e00e3aab","Type":"ContainerDied","Data":"b518ce61b84c42bf82a7212f4d9e88739f081f111fb39d9a384246703239366b"}
Mar 21 03:54:20 crc kubenswrapper[4685]: I0321 03:54:20.230549 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8"
Mar 21 03:54:20 crc kubenswrapper[4685]: I0321 03:54:20.230554 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-pjvx8" event={"ID":"5a823511-d878-4e6d-acda-4202e00e3aab","Type":"ContainerDied","Data":"d523779b5ef9b1c83fe719b88abac26dc872a10f67b416c1a1b169132b4b238e"}
Mar 21 03:54:20 crc kubenswrapper[4685]: I0321 03:54:20.230573 4685 scope.go:117] "RemoveContainer" containerID="b518ce61b84c42bf82a7212f4d9e88739f081f111fb39d9a384246703239366b"
Mar 21 03:54:20 crc kubenswrapper[4685]: I0321 03:54:20.258219 4685 scope.go:117] "RemoveContainer" containerID="b518ce61b84c42bf82a7212f4d9e88739f081f111fb39d9a384246703239366b"
Mar 21 03:54:20 crc kubenswrapper[4685]: E0321 03:54:20.258910 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b518ce61b84c42bf82a7212f4d9e88739f081f111fb39d9a384246703239366b\": container with ID starting with b518ce61b84c42bf82a7212f4d9e88739f081f111fb39d9a384246703239366b not found: ID does not exist" containerID="b518ce61b84c42bf82a7212f4d9e88739f081f111fb39d9a384246703239366b"
Mar 21 03:54:20 crc kubenswrapper[4685]: I0321 03:54:20.258948 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b518ce61b84c42bf82a7212f4d9e88739f081f111fb39d9a384246703239366b"} err="failed to get container status \"b518ce61b84c42bf82a7212f4d9e88739f081f111fb39d9a384246703239366b\": rpc error: code = NotFound desc = could not find container \"b518ce61b84c42bf82a7212f4d9e88739f081f111fb39d9a384246703239366b\": container with ID starting with b518ce61b84c42bf82a7212f4d9e88739f081f111fb39d9a384246703239366b not found: ID does not exist"
Mar 21 03:54:20 crc kubenswrapper[4685]: I0321 03:54:20.264726 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-pjvx8"]
Mar 21 03:54:20 crc kubenswrapper[4685]: I0321 03:54:20.264777 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-pjvx8"]
Mar 21 03:54:20 crc kubenswrapper[4685]: I0321 03:54:20.308689 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a823511-d878-4e6d-acda-4202e00e3aab" path="/var/lib/kubelet/pods/5a823511-d878-4e6d-acda-4202e00e3aab/volumes"
Mar 21 03:56:00 crc kubenswrapper[4685]: I0321 03:56:00.151936 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567756-c8grb"]
Mar 21 03:56:00 crc kubenswrapper[4685]: E0321 03:56:00.153097 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1ef6e0e-74c2-4e2b-bcfa-70d821d09201" containerName="oc"
Mar 21 03:56:00 crc kubenswrapper[4685]: I0321 03:56:00.153122 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1ef6e0e-74c2-4e2b-bcfa-70d821d09201" containerName="oc"
Mar 21 03:56:00 crc kubenswrapper[4685]: E0321 03:56:00.153157 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a823511-d878-4e6d-acda-4202e00e3aab" containerName="registry"
Mar 21 03:56:00 crc kubenswrapper[4685]: I0321 03:56:00.153170 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a823511-d878-4e6d-acda-4202e00e3aab" containerName="registry"
Mar 21 03:56:00 crc kubenswrapper[4685]: I0321 03:56:00.153361 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1ef6e0e-74c2-4e2b-bcfa-70d821d09201" containerName="oc"
Mar 21 03:56:00 crc kubenswrapper[4685]: I0321 03:56:00.153398 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a823511-d878-4e6d-acda-4202e00e3aab" containerName="registry"
Mar 21 03:56:00 crc kubenswrapper[4685]: I0321 03:56:00.154036 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567756-c8grb"
Mar 21 03:56:00 crc kubenswrapper[4685]: I0321 03:56:00.156751 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k75cc"
Mar 21 03:56:00 crc kubenswrapper[4685]: I0321 03:56:00.157783 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 03:56:00 crc kubenswrapper[4685]: I0321 03:56:00.158060 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 03:56:00 crc kubenswrapper[4685]: I0321 03:56:00.170281 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxxrq\" (UniqueName: \"kubernetes.io/projected/a4329ac2-6343-445e-95a6-09a4c21aeef4-kube-api-access-fxxrq\") pod \"auto-csr-approver-29567756-c8grb\" (UID: \"a4329ac2-6343-445e-95a6-09a4c21aeef4\") " pod="openshift-infra/auto-csr-approver-29567756-c8grb"
Mar 21 03:56:00 crc kubenswrapper[4685]: I0321 03:56:00.177923 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567756-c8grb"]
Mar 21 03:56:00 crc kubenswrapper[4685]: I0321 03:56:00.271018 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxxrq\" (UniqueName: \"kubernetes.io/projected/a4329ac2-6343-445e-95a6-09a4c21aeef4-kube-api-access-fxxrq\") pod \"auto-csr-approver-29567756-c8grb\" (UID: \"a4329ac2-6343-445e-95a6-09a4c21aeef4\") " pod="openshift-infra/auto-csr-approver-29567756-c8grb"
Mar 21 03:56:00 crc kubenswrapper[4685]: I0321 03:56:00.293553 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxxrq\" (UniqueName: \"kubernetes.io/projected/a4329ac2-6343-445e-95a6-09a4c21aeef4-kube-api-access-fxxrq\") pod \"auto-csr-approver-29567756-c8grb\" (UID: \"a4329ac2-6343-445e-95a6-09a4c21aeef4\") " pod="openshift-infra/auto-csr-approver-29567756-c8grb"
Mar 21 03:56:00 crc kubenswrapper[4685]: I0321 03:56:00.504926 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567756-c8grb"
Mar 21 03:56:00 crc kubenswrapper[4685]: I0321 03:56:00.899233 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567756-c8grb"]
Mar 21 03:56:00 crc kubenswrapper[4685]: I0321 03:56:00.904529 4685 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 21 03:56:01 crc kubenswrapper[4685]: I0321 03:56:01.876444 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567756-c8grb" event={"ID":"a4329ac2-6343-445e-95a6-09a4c21aeef4","Type":"ContainerStarted","Data":"870d4234532a73f3e9efc743fa7dceeca9e20a572a8665dd683a0fd17339795b"}
Mar 21 03:56:02 crc kubenswrapper[4685]: I0321 03:56:02.885646 4685 generic.go:334] "Generic (PLEG): container finished" podID="a4329ac2-6343-445e-95a6-09a4c21aeef4" containerID="ec65004c87a3ece56139f7b03f69c9b9926eef8f676f1143526c439647935efa" exitCode=0
Mar 21 03:56:02 crc kubenswrapper[4685]: I0321 03:56:02.885713 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567756-c8grb" event={"ID":"a4329ac2-6343-445e-95a6-09a4c21aeef4","Type":"ContainerDied","Data":"ec65004c87a3ece56139f7b03f69c9b9926eef8f676f1143526c439647935efa"}
Mar 21 03:56:04 crc kubenswrapper[4685]: I0321 03:56:04.183700 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567756-c8grb"
Mar 21 03:56:04 crc kubenswrapper[4685]: I0321 03:56:04.319735 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxxrq\" (UniqueName: \"kubernetes.io/projected/a4329ac2-6343-445e-95a6-09a4c21aeef4-kube-api-access-fxxrq\") pod \"a4329ac2-6343-445e-95a6-09a4c21aeef4\" (UID: \"a4329ac2-6343-445e-95a6-09a4c21aeef4\") "
Mar 21 03:56:04 crc kubenswrapper[4685]: I0321 03:56:04.331869 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4329ac2-6343-445e-95a6-09a4c21aeef4-kube-api-access-fxxrq" (OuterVolumeSpecName: "kube-api-access-fxxrq") pod "a4329ac2-6343-445e-95a6-09a4c21aeef4" (UID: "a4329ac2-6343-445e-95a6-09a4c21aeef4"). InnerVolumeSpecName "kube-api-access-fxxrq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 03:56:04 crc kubenswrapper[4685]: I0321 03:56:04.421871 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxxrq\" (UniqueName: \"kubernetes.io/projected/a4329ac2-6343-445e-95a6-09a4c21aeef4-kube-api-access-fxxrq\") on node \"crc\" DevicePath \"\""
Mar 21 03:56:04 crc kubenswrapper[4685]: I0321 03:56:04.902217 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567756-c8grb" event={"ID":"a4329ac2-6343-445e-95a6-09a4c21aeef4","Type":"ContainerDied","Data":"870d4234532a73f3e9efc743fa7dceeca9e20a572a8665dd683a0fd17339795b"}
Mar 21 03:56:04 crc kubenswrapper[4685]: I0321 03:56:04.902253 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567756-c8grb"
Mar 21 03:56:04 crc kubenswrapper[4685]: I0321 03:56:04.902263 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="870d4234532a73f3e9efc743fa7dceeca9e20a572a8665dd683a0fd17339795b"
Mar 21 03:56:05 crc kubenswrapper[4685]: I0321 03:56:05.261150 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567750-hlpdj"]
Mar 21 03:56:05 crc kubenswrapper[4685]: I0321 03:56:05.268408 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567750-hlpdj"]
Mar 21 03:56:06 crc kubenswrapper[4685]: I0321 03:56:06.309976 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a68d63c-113c-4421-9444-78d05d636874" path="/var/lib/kubelet/pods/2a68d63c-113c-4421-9444-78d05d636874/volumes"
Mar 21 03:56:09 crc kubenswrapper[4685]: I0321 03:56:09.687029 4685 patch_prober.go:28] interesting pod/machine-config-daemon-7r9cg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 21 03:56:09 crc kubenswrapper[4685]: I0321 03:56:09.688664 4685 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 21 03:56:27 crc kubenswrapper[4685]: I0321 03:56:27.679575 4685 scope.go:117] "RemoveContainer" containerID="417a39165b63bc101bfa9471132fb58a4c884a0229b4c6c2c08111c875b2e605"
Mar 21 03:56:39 crc kubenswrapper[4685]: I0321 03:56:39.685623 4685 patch_prober.go:28] interesting pod/machine-config-daemon-7r9cg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 21 03:56:39 crc kubenswrapper[4685]: I0321 03:56:39.686509 4685 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 21 03:57:09 crc kubenswrapper[4685]: I0321 03:57:09.685350 4685 patch_prober.go:28] interesting pod/machine-config-daemon-7r9cg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 21 03:57:09 crc kubenswrapper[4685]: I0321 03:57:09.686180 4685 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 21 03:57:09 crc kubenswrapper[4685]: I0321 03:57:09.686264 4685 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg"
Mar 21 03:57:09 crc kubenswrapper[4685]: I0321 03:57:09.688544 4685 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"51700df58050c3bd486b7492e271833d0dee5610ed2bdc61e612672321528c6c"} pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 21 03:57:09 crc kubenswrapper[4685]: I0321 03:57:09.688738 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerName="machine-config-daemon" containerID="cri-o://51700df58050c3bd486b7492e271833d0dee5610ed2bdc61e612672321528c6c" gracePeriod=600
Mar 21 03:57:10 crc kubenswrapper[4685]: I0321 03:57:10.364283 4685 generic.go:334] "Generic (PLEG): container finished" podID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerID="51700df58050c3bd486b7492e271833d0dee5610ed2bdc61e612672321528c6c" exitCode=0
Mar 21 03:57:10 crc kubenswrapper[4685]: I0321 03:57:10.364340 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" event={"ID":"cea46fe2-4e41-43ab-a069-cb30fb4e732c","Type":"ContainerDied","Data":"51700df58050c3bd486b7492e271833d0dee5610ed2bdc61e612672321528c6c"}
Mar 21 03:57:10 crc kubenswrapper[4685]: I0321 03:57:10.364696 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" event={"ID":"cea46fe2-4e41-43ab-a069-cb30fb4e732c","Type":"ContainerStarted","Data":"da8be442c3ea2f96e685bee081e96f02736707ffa414186cd8dedbc178b8c1c5"}
Mar 21 03:57:10 crc kubenswrapper[4685]: I0321 03:57:10.364732 4685 scope.go:117] "RemoveContainer" containerID="8ae2ea0f2d37402c06f62c37b02e2377743ae0dc80e5e3ec752094ab6ef40392"
Mar 21 03:57:27 crc kubenswrapper[4685]: I0321 03:57:27.715577 4685 scope.go:117] "RemoveContainer" containerID="15cadc160d591c0cd64e34f00ca0765d613d878a4c236a5abb768732fbb0c4de"
Mar 21 03:57:27 crc kubenswrapper[4685]: I0321 03:57:27.756414 4685 scope.go:117] "RemoveContainer" containerID="c4109a761f654c4cea80457a301a9250eeca2a8dc37ed5a953f9fa2bbae1da3c"
Mar 21 03:57:27 crc kubenswrapper[4685]: I0321 03:57:27.772033 4685 scope.go:117] "RemoveContainer" containerID="83f58c6672a9d219e7c6fe2461498dcaa634db2007f4484fb3e6c7781cfa6bf9"
Mar 21 03:57:27 crc kubenswrapper[4685]: I0321 03:57:27.801232 4685 scope.go:117] "RemoveContainer" containerID="a3c984485cfb2991b4602fbfe5daaa8e39fbcbabd4aee25932bc4653a5affe76"
Mar 21 03:57:27 crc kubenswrapper[4685]: I0321 03:57:27.815006 4685 scope.go:117] "RemoveContainer" containerID="31deeee9e2e17aaa1f7fce67d10f5816041023ee773c3b8ac942aa018c0dcde1"
Mar 21 03:58:00 crc kubenswrapper[4685]: I0321 03:58:00.136380 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567758-77xb4"]
Mar 21 03:58:00 crc kubenswrapper[4685]: E0321 03:58:00.138276 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4329ac2-6343-445e-95a6-09a4c21aeef4" containerName="oc"
Mar 21 03:58:00 crc kubenswrapper[4685]: I0321 03:58:00.138332 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4329ac2-6343-445e-95a6-09a4c21aeef4" containerName="oc"
Mar 21 03:58:00 crc kubenswrapper[4685]: I0321 03:58:00.138494 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4329ac2-6343-445e-95a6-09a4c21aeef4" containerName="oc"
Mar 21 03:58:00 crc kubenswrapper[4685]: I0321 03:58:00.138959 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567758-77xb4"
Mar 21 03:58:00 crc kubenswrapper[4685]: I0321 03:58:00.142905 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k75cc"
Mar 21 03:58:00 crc kubenswrapper[4685]: I0321 03:58:00.143016 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 03:58:00 crc kubenswrapper[4685]: I0321 03:58:00.146609 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 03:58:00 crc kubenswrapper[4685]: I0321 03:58:00.146933 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567758-77xb4"]
Mar 21 03:58:00 crc kubenswrapper[4685]: I0321 03:58:00.213451 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzv2h\" (UniqueName: \"kubernetes.io/projected/c94a3bd5-4817-40c9-8adc-d3bdf8e42180-kube-api-access-lzv2h\") pod \"auto-csr-approver-29567758-77xb4\" (UID: \"c94a3bd5-4817-40c9-8adc-d3bdf8e42180\") " pod="openshift-infra/auto-csr-approver-29567758-77xb4"
Mar 21 03:58:00 crc kubenswrapper[4685]: I0321 03:58:00.314984 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzv2h\" (UniqueName: \"kubernetes.io/projected/c94a3bd5-4817-40c9-8adc-d3bdf8e42180-kube-api-access-lzv2h\") pod \"auto-csr-approver-29567758-77xb4\" (UID: \"c94a3bd5-4817-40c9-8adc-d3bdf8e42180\") " pod="openshift-infra/auto-csr-approver-29567758-77xb4"
Mar 21 03:58:00 crc kubenswrapper[4685]: I0321 03:58:00.341980 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzv2h\" (UniqueName: \"kubernetes.io/projected/c94a3bd5-4817-40c9-8adc-d3bdf8e42180-kube-api-access-lzv2h\") pod \"auto-csr-approver-29567758-77xb4\" (UID: \"c94a3bd5-4817-40c9-8adc-d3bdf8e42180\") " pod="openshift-infra/auto-csr-approver-29567758-77xb4"
Mar 21 03:58:00 crc kubenswrapper[4685]: I0321 03:58:00.497156 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567758-77xb4"
Mar 21 03:58:00 crc kubenswrapper[4685]: I0321 03:58:00.951877 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567758-77xb4"]
Mar 21 03:58:01 crc kubenswrapper[4685]: I0321 03:58:01.061061 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567758-77xb4" event={"ID":"c94a3bd5-4817-40c9-8adc-d3bdf8e42180","Type":"ContainerStarted","Data":"5fe3fd0501eb631bc9a4c6ae9dba39e15f5e9fe205179cdf7e2cd0220bf0d4aa"}
Mar 21 03:58:03 crc kubenswrapper[4685]: I0321 03:58:03.074657 4685 generic.go:334] "Generic (PLEG): container finished" podID="c94a3bd5-4817-40c9-8adc-d3bdf8e42180" containerID="5fc29b3be8e21d56ea45ae6e5f1b216d3a227b41111f1fd30378634675fd304a" exitCode=0
Mar 21 03:58:03 crc kubenswrapper[4685]: I0321 03:58:03.074697 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567758-77xb4" event={"ID":"c94a3bd5-4817-40c9-8adc-d3bdf8e42180","Type":"ContainerDied","Data":"5fc29b3be8e21d56ea45ae6e5f1b216d3a227b41111f1fd30378634675fd304a"}
Mar 21 03:58:04 crc kubenswrapper[4685]: I0321 03:58:04.339645 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567758-77xb4"
Mar 21 03:58:04 crc kubenswrapper[4685]: I0321 03:58:04.363516 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzv2h\" (UniqueName: \"kubernetes.io/projected/c94a3bd5-4817-40c9-8adc-d3bdf8e42180-kube-api-access-lzv2h\") pod \"c94a3bd5-4817-40c9-8adc-d3bdf8e42180\" (UID: \"c94a3bd5-4817-40c9-8adc-d3bdf8e42180\") "
Mar 21 03:58:04 crc kubenswrapper[4685]: I0321 03:58:04.369549 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c94a3bd5-4817-40c9-8adc-d3bdf8e42180-kube-api-access-lzv2h" (OuterVolumeSpecName: "kube-api-access-lzv2h") pod "c94a3bd5-4817-40c9-8adc-d3bdf8e42180" (UID: "c94a3bd5-4817-40c9-8adc-d3bdf8e42180"). InnerVolumeSpecName "kube-api-access-lzv2h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 03:58:04 crc kubenswrapper[4685]: I0321 03:58:04.465235 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzv2h\" (UniqueName: \"kubernetes.io/projected/c94a3bd5-4817-40c9-8adc-d3bdf8e42180-kube-api-access-lzv2h\") on node \"crc\" DevicePath \"\""
Mar 21 03:58:05 crc kubenswrapper[4685]: I0321 03:58:05.090674 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567758-77xb4" event={"ID":"c94a3bd5-4817-40c9-8adc-d3bdf8e42180","Type":"ContainerDied","Data":"5fe3fd0501eb631bc9a4c6ae9dba39e15f5e9fe205179cdf7e2cd0220bf0d4aa"}
Mar 21 03:58:05 crc kubenswrapper[4685]: I0321 03:58:05.090711 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fe3fd0501eb631bc9a4c6ae9dba39e15f5e9fe205179cdf7e2cd0220bf0d4aa"
Mar 21 03:58:05 crc kubenswrapper[4685]: I0321 03:58:05.090728 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567758-77xb4"
Mar 21 03:58:05 crc kubenswrapper[4685]: I0321 03:58:05.404320 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567752-khgzj"]
Mar 21 03:58:05 crc kubenswrapper[4685]: I0321 03:58:05.413915 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567752-khgzj"]
Mar 21 03:58:06 crc kubenswrapper[4685]: I0321 03:58:06.310060 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="795b0bef-045f-4b6d-8b0b-60b79ccbded1" path="/var/lib/kubelet/pods/795b0bef-045f-4b6d-8b0b-60b79ccbded1/volumes"
Mar 21 03:58:27 crc kubenswrapper[4685]: I0321 03:58:27.877138 4685 scope.go:117] "RemoveContainer" containerID="f1e28daa02ad3b1ab2ee46fac8b55ad548ffac6d7b98447347bf9ea0ae90df6d"
Mar 21 03:59:02 crc kubenswrapper[4685]: I0321 03:59:02.663944 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-cpfzk"]
Mar 21 03:59:02 crc kubenswrapper[4685]: I0321 03:59:02.665003 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="ovn-controller" containerID="cri-o://473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5" gracePeriod=30
Mar 21 03:59:02 crc kubenswrapper[4685]: I0321 03:59:02.665376 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa" gracePeriod=30
Mar 21 03:59:02 crc kubenswrapper[4685]: I0321 03:59:02.665383 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="northd" containerID="cri-o://2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a" gracePeriod=30
Mar 21 03:59:02 crc kubenswrapper[4685]: I0321 03:59:02.665450 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="ovn-acl-logging" containerID="cri-o://d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320" gracePeriod=30
Mar 21 03:59:02 crc kubenswrapper[4685]: I0321 03:59:02.665615 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="sbdb" containerID="cri-o://86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485" gracePeriod=30
Mar 21 03:59:02 crc kubenswrapper[4685]: I0321 03:59:02.665626 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="kube-rbac-proxy-node" containerID="cri-o://a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f" gracePeriod=30
Mar 21 03:59:02 crc kubenswrapper[4685]: I0321 03:59:02.665685 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="nbdb" containerID="cri-o://34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8" gracePeriod=30
Mar 21 03:59:02 crc kubenswrapper[4685]: I0321 03:59:02.701546 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="ovnkube-controller" containerID="cri-o://e3b8a82bebc823be171b9425db7ef881f4918b904d172973b06ee5b15fe79a15" gracePeriod=30
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.008117 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cpfzk_08dfc393-0ddb-4bde-9b1f-2a48549f4549/ovnkube-controller/3.log"
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.010175 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cpfzk_08dfc393-0ddb-4bde-9b1f-2a48549f4549/ovn-acl-logging/0.log"
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.010647 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cpfzk_08dfc393-0ddb-4bde-9b1f-2a48549f4549/ovn-controller/0.log"
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.011049 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk"
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.063788 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pb8hx"]
Mar 21 03:59:03 crc kubenswrapper[4685]: E0321 03:59:03.063981 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="ovn-acl-logging"
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.063994 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="ovn-acl-logging"
Mar 21 03:59:03 crc kubenswrapper[4685]: E0321 03:59:03.064002 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="northd"
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.064008 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="northd"
Mar 21 03:59:03 crc kubenswrapper[4685]: E0321 03:59:03.064018 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="kubecfg-setup"
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.064024 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="kubecfg-setup"
Mar 21 03:59:03 crc kubenswrapper[4685]: E0321 03:59:03.064032 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="ovnkube-controller"
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.064038 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="ovnkube-controller"
Mar 21 03:59:03 crc kubenswrapper[4685]: E0321 03:59:03.064044 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="nbdb"
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.064049 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="nbdb"
Mar 21 03:59:03 crc kubenswrapper[4685]: E0321 03:59:03.064058 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c94a3bd5-4817-40c9-8adc-d3bdf8e42180" containerName="oc"
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.064065 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="c94a3bd5-4817-40c9-8adc-d3bdf8e42180" containerName="oc"
Mar 21 03:59:03 crc kubenswrapper[4685]: E0321 03:59:03.064073 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="kube-rbac-proxy-node"
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.064080 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="kube-rbac-proxy-node"
Mar 21 03:59:03 crc kubenswrapper[4685]: E0321 03:59:03.064092 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="ovnkube-controller"
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.064098 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="ovnkube-controller"
Mar 21 03:59:03 crc kubenswrapper[4685]: E0321 03:59:03.064105 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="kube-rbac-proxy-ovn-metrics"
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.064110 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="kube-rbac-proxy-ovn-metrics"
Mar 21 03:59:03 crc kubenswrapper[4685]: E0321 03:59:03.064119 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="ovn-controller"
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.064125 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="ovn-controller"
Mar 21 03:59:03 crc kubenswrapper[4685]: E0321 03:59:03.064132 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="sbdb"
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.064137 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="sbdb"
Mar 21 03:59:03 crc kubenswrapper[4685]: E0321 03:59:03.064147 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="ovnkube-controller"
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.064153 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="ovnkube-controller"
Mar 21 03:59:03 crc kubenswrapper[4685]: E0321 03:59:03.064161 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="ovnkube-controller"
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.064167 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="ovnkube-controller"
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.064241 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="kube-rbac-proxy-node"
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.064252 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="ovnkube-controller"
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.064259 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="c94a3bd5-4817-40c9-8adc-d3bdf8e42180" containerName="oc"
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.064291 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="kube-rbac-proxy-ovn-metrics"
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.064299 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="ovnkube-controller"
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.064305 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="ovn-acl-logging"
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.064313 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="ovnkube-controller"
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.064320 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="northd"
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.064328 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="sbdb"
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.064337 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="ovn-controller"
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.064342 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="nbdb"
Mar 21 03:59:03 crc kubenswrapper[4685]: E0321 03:59:03.064420 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="ovnkube-controller"
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.064427 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="ovnkube-controller"
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.064514 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="ovnkube-controller"
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.064523 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerName="ovnkube-controller"
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.065939 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx"
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.195985 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-cni-bin\") pod \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") "
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.196041 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdrfx\" (UniqueName: \"kubernetes.io/projected/08dfc393-0ddb-4bde-9b1f-2a48549f4549-kube-api-access-fdrfx\") pod \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") "
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.196064 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-run-netns\") pod \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") "
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.196077 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-cni-netd\") pod \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") "
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.196101 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-kubelet\") pod \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") "
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.196099 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "08dfc393-0ddb-4bde-9b1f-2a48549f4549" (UID: "08dfc393-0ddb-4bde-9b1f-2a48549f4549"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.196118 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/08dfc393-0ddb-4bde-9b1f-2a48549f4549-ovnkube-script-lib\") pod \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") "
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.196217 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "08dfc393-0ddb-4bde-9b1f-2a48549f4549" (UID: "08dfc393-0ddb-4bde-9b1f-2a48549f4549"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.196259 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "08dfc393-0ddb-4bde-9b1f-2a48549f4549" (UID: "08dfc393-0ddb-4bde-9b1f-2a48549f4549"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.196330 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-var-lib-openvswitch\") pod \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") "
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.196380 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "08dfc393-0ddb-4bde-9b1f-2a48549f4549" (UID: "08dfc393-0ddb-4bde-9b1f-2a48549f4549"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.196419 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-node-log\") pod \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") "
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.196401 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "08dfc393-0ddb-4bde-9b1f-2a48549f4549" (UID: "08dfc393-0ddb-4bde-9b1f-2a48549f4549"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.196469 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-run-systemd\") pod \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") "
Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.196467 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-node-log" (OuterVolumeSpecName: "node-log") pod "08dfc393-0ddb-4bde-9b1f-2a48549f4549" (UID: "08dfc393-0ddb-4bde-9b1f-2a48549f4549"). InnerVolumeSpecName "node-log".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.196494 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-log-socket\") pod \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.196518 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/08dfc393-0ddb-4bde-9b1f-2a48549f4549-env-overrides\") pod \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.196541 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-var-lib-cni-networks-ovn-kubernetes\") pod \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.196564 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/08dfc393-0ddb-4bde-9b1f-2a48549f4549-ovn-node-metrics-cert\") pod \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.196588 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-systemd-units\") pod \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.196630 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/08dfc393-0ddb-4bde-9b1f-2a48549f4549-ovnkube-config\") pod \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.196648 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-run-openvswitch\") pod \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.196666 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-run-ovn-kubernetes\") pod \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.196682 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-run-ovn\") pod \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.196724 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-slash\") pod 
\"08dfc393-0ddb-4bde-9b1f-2a48549f4549\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.196740 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-etc-openvswitch\") pod \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\" (UID: \"08dfc393-0ddb-4bde-9b1f-2a48549f4549\") " Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.196880 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08dfc393-0ddb-4bde-9b1f-2a48549f4549-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "08dfc393-0ddb-4bde-9b1f-2a48549f4549" (UID: "08dfc393-0ddb-4bde-9b1f-2a48549f4549"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.196912 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "08dfc393-0ddb-4bde-9b1f-2a48549f4549" (UID: "08dfc393-0ddb-4bde-9b1f-2a48549f4549"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.196931 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c6448261-a026-47ee-89ca-dcb0374602b8-ovnkube-script-lib\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.196942 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "08dfc393-0ddb-4bde-9b1f-2a48549f4549" (UID: "08dfc393-0ddb-4bde-9b1f-2a48549f4549"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.196953 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c6448261-a026-47ee-89ca-dcb0374602b8-ovn-node-metrics-cert\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.196976 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-host-cni-netd\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.196986 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08dfc393-0ddb-4bde-9b1f-2a48549f4549-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "08dfc393-0ddb-4bde-9b1f-2a48549f4549" (UID: "08dfc393-0ddb-4bde-9b1f-2a48549f4549"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.197036 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-log-socket" (OuterVolumeSpecName: "log-socket") pod "08dfc393-0ddb-4bde-9b1f-2a48549f4549" (UID: "08dfc393-0ddb-4bde-9b1f-2a48549f4549"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.196996 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-host-slash\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.197058 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "08dfc393-0ddb-4bde-9b1f-2a48549f4549" (UID: "08dfc393-0ddb-4bde-9b1f-2a48549f4549"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.197095 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-host-cni-bin\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.197086 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "08dfc393-0ddb-4bde-9b1f-2a48549f4549" (UID: "08dfc393-0ddb-4bde-9b1f-2a48549f4549"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.197110 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-var-lib-openvswitch\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.197132 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "08dfc393-0ddb-4bde-9b1f-2a48549f4549" (UID: "08dfc393-0ddb-4bde-9b1f-2a48549f4549"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.197169 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-run-systemd\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.197202 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.197566 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c6448261-a026-47ee-89ca-dcb0374602b8-ovnkube-config\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.197312 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "08dfc393-0ddb-4bde-9b1f-2a48549f4549" (UID: "08dfc393-0ddb-4bde-9b1f-2a48549f4549"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.197593 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c6448261-a026-47ee-89ca-dcb0374602b8-env-overrides\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.197661 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-host-run-ovn-kubernetes\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.197705 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-run-openvswitch\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.197736 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-node-log\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.197773 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-run-ovn\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.197825 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-host-kubelet\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.197342 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-slash" (OuterVolumeSpecName: "host-slash") pod "08dfc393-0ddb-4bde-9b1f-2a48549f4549" (UID: "08dfc393-0ddb-4bde-9b1f-2a48549f4549"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.197494 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08dfc393-0ddb-4bde-9b1f-2a48549f4549-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "08dfc393-0ddb-4bde-9b1f-2a48549f4549" (UID: "08dfc393-0ddb-4bde-9b1f-2a48549f4549"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.197962 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-host-run-netns\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.197989 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-etc-openvswitch\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.198018 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-log-socket\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.198040 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2n4t\" (UniqueName: \"kubernetes.io/projected/c6448261-a026-47ee-89ca-dcb0374602b8-kube-api-access-t2n4t\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.198068 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-systemd-units\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc 
kubenswrapper[4685]: I0321 03:59:03.198224 4685 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-node-log\") on node \"crc\" DevicePath \"\"" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.198249 4685 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-log-socket\") on node \"crc\" DevicePath \"\"" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.198263 4685 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/08dfc393-0ddb-4bde-9b1f-2a48549f4549-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.198278 4685 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.198291 4685 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.198305 4685 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/08dfc393-0ddb-4bde-9b1f-2a48549f4549-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.198318 4685 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.198330 4685 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.198342 4685 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.198353 4685 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-slash\") on node \"crc\" DevicePath \"\"" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.198363 4685 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.198374 4685 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.198387 4685 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.198397 
4685 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.198407 4685 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.198419 4685 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/08dfc393-0ddb-4bde-9b1f-2a48549f4549-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.198430 4685 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.202189 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08dfc393-0ddb-4bde-9b1f-2a48549f4549-kube-api-access-fdrfx" (OuterVolumeSpecName: "kube-api-access-fdrfx") pod "08dfc393-0ddb-4bde-9b1f-2a48549f4549" (UID: "08dfc393-0ddb-4bde-9b1f-2a48549f4549"). InnerVolumeSpecName "kube-api-access-fdrfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.202611 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08dfc393-0ddb-4bde-9b1f-2a48549f4549-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "08dfc393-0ddb-4bde-9b1f-2a48549f4549" (UID: "08dfc393-0ddb-4bde-9b1f-2a48549f4549"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.219978 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "08dfc393-0ddb-4bde-9b1f-2a48549f4549" (UID: "08dfc393-0ddb-4bde-9b1f-2a48549f4549"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.299596 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-etc-openvswitch\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.299665 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-host-run-netns\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.299691 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-log-socket\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.299760 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-log-socket\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.299798 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-etc-openvswitch\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.299828 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-host-run-netns\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.300007 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2n4t\" (UniqueName: \"kubernetes.io/projected/c6448261-a026-47ee-89ca-dcb0374602b8-kube-api-access-t2n4t\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.300614 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-systemd-units\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.300716 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-systemd-units\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.300927 4685 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c6448261-a026-47ee-89ca-dcb0374602b8-ovnkube-script-lib\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.302006 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c6448261-a026-47ee-89ca-dcb0374602b8-ovn-node-metrics-cert\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.302112 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-host-cni-netd\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.302157 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-host-slash\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.302208 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-host-cni-bin\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.302221 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-host-cni-netd\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.302237 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-host-slash\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.302244 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-var-lib-openvswitch\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.302234 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c6448261-a026-47ee-89ca-dcb0374602b8-ovnkube-script-lib\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.302291 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-run-systemd\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.302326 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.302334 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-host-cni-bin\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.302368 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c6448261-a026-47ee-89ca-dcb0374602b8-ovnkube-config\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.302373 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-run-systemd\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.302354 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-var-lib-openvswitch\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.302403 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c6448261-a026-47ee-89ca-dcb0374602b8-env-overrides\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.302435 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.302471 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-host-run-ovn-kubernetes\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.302533 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-run-openvswitch\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.302582 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-host-run-ovn-kubernetes\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.302588 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-node-log\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.302662 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-node-log\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.302684 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-run-ovn\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.302601 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-run-openvswitch\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.302745 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-run-ovn\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.302802 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-host-kubelet\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.302979 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdrfx\" (UniqueName: \"kubernetes.io/projected/08dfc393-0ddb-4bde-9b1f-2a48549f4549-kube-api-access-fdrfx\") on node \"crc\" DevicePath \"\"" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.303012 4685 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/08dfc393-0ddb-4bde-9b1f-2a48549f4549-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.303037 4685 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/08dfc393-0ddb-4bde-9b1f-2a48549f4549-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.303013 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c6448261-a026-47ee-89ca-dcb0374602b8-env-overrides\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.303055 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6448261-a026-47ee-89ca-dcb0374602b8-host-kubelet\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.303154 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c6448261-a026-47ee-89ca-dcb0374602b8-ovnkube-config\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.306980 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c6448261-a026-47ee-89ca-dcb0374602b8-ovn-node-metrics-cert\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.317519 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2n4t\" (UniqueName: \"kubernetes.io/projected/c6448261-a026-47ee-89ca-dcb0374602b8-kube-api-access-t2n4t\") pod \"ovnkube-node-pb8hx\" (UID: \"c6448261-a026-47ee-89ca-dcb0374602b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.377876 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:03 crc kubenswrapper[4685]: W0321 03:59:03.406791 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6448261_a026_47ee_89ca_dcb0374602b8.slice/crio-cce2790d1baabeb6aff49d682e4a74bb6a3d4232ec254fdc81cda70d26688ece WatchSource:0}: Error finding container cce2790d1baabeb6aff49d682e4a74bb6a3d4232ec254fdc81cda70d26688ece: Status 404 returned error can't find the container with id cce2790d1baabeb6aff49d682e4a74bb6a3d4232ec254fdc81cda70d26688ece Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.448793 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7jcm2_cd9b1743-6b69-46d3-a429-6f83bf43317a/kube-multus/2.log" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.449778 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7jcm2_cd9b1743-6b69-46d3-a429-6f83bf43317a/kube-multus/1.log" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.449814 4685 generic.go:334] "Generic (PLEG): container finished" podID="cd9b1743-6b69-46d3-a429-6f83bf43317a" containerID="aeb2e6d1910f6dc402503b823272d26ca9f0ffb3b41c3137e50a4345d1710170" exitCode=2 Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.449880 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7jcm2" event={"ID":"cd9b1743-6b69-46d3-a429-6f83bf43317a","Type":"ContainerDied","Data":"aeb2e6d1910f6dc402503b823272d26ca9f0ffb3b41c3137e50a4345d1710170"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.449912 4685 scope.go:117] "RemoveContainer" containerID="ffa428e52a3c6324be6bace33035c1626678061a65e2badd389c6e93850ce25f" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.450750 4685 scope.go:117] "RemoveContainer" containerID="aeb2e6d1910f6dc402503b823272d26ca9f0ffb3b41c3137e50a4345d1710170" Mar 21 03:59:03 crc kubenswrapper[4685]: E0321 03:59:03.451163 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-7jcm2_openshift-multus(cd9b1743-6b69-46d3-a429-6f83bf43317a)\"" pod="openshift-multus/multus-7jcm2" podUID="cd9b1743-6b69-46d3-a429-6f83bf43317a" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.455977 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cpfzk_08dfc393-0ddb-4bde-9b1f-2a48549f4549/ovnkube-controller/3.log" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.459578 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cpfzk_08dfc393-0ddb-4bde-9b1f-2a48549f4549/ovn-acl-logging/0.log" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.460150 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cpfzk_08dfc393-0ddb-4bde-9b1f-2a48549f4549/ovn-controller/0.log" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.460513 4685 generic.go:334] "Generic (PLEG): container finished" podID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerID="e3b8a82bebc823be171b9425db7ef881f4918b904d172973b06ee5b15fe79a15" exitCode=0 Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.460546 4685 generic.go:334] "Generic (PLEG): container finished" podID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" 
containerID="86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485" exitCode=0 Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.460556 4685 generic.go:334] "Generic (PLEG): container finished" podID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerID="34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8" exitCode=0 Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.460551 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" event={"ID":"08dfc393-0ddb-4bde-9b1f-2a48549f4549","Type":"ContainerDied","Data":"e3b8a82bebc823be171b9425db7ef881f4918b904d172973b06ee5b15fe79a15"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.460609 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.460611 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" event={"ID":"08dfc393-0ddb-4bde-9b1f-2a48549f4549","Type":"ContainerDied","Data":"86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.460637 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" event={"ID":"08dfc393-0ddb-4bde-9b1f-2a48549f4549","Type":"ContainerDied","Data":"34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.460564 4685 generic.go:334] "Generic (PLEG): container finished" podID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerID="2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a" exitCode=0 Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.460659 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" event={"ID":"08dfc393-0ddb-4bde-9b1f-2a48549f4549","Type":"ContainerDied","Data":"2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.460679 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" event={"ID":"08dfc393-0ddb-4bde-9b1f-2a48549f4549","Type":"ContainerDied","Data":"beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.460661 4685 generic.go:334] "Generic (PLEG): container finished" podID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerID="beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa" exitCode=0 Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.460721 4685 generic.go:334] "Generic (PLEG): container finished" podID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerID="a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f" exitCode=0 Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.460741 4685 generic.go:334] "Generic (PLEG): container finished" podID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerID="d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320" exitCode=143 Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.460754 4685 generic.go:334] "Generic (PLEG): container finished" podID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" containerID="473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5" exitCode=143 Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.460800 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" event={"ID":"08dfc393-0ddb-4bde-9b1f-2a48549f4549","Type":"ContainerDied","Data":"a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.460868 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e3b8a82bebc823be171b9425db7ef881f4918b904d172973b06ee5b15fe79a15"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.460885 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.460896 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.460903 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.460911 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.460922 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.460937 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.460945 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.460953 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.460960 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.460977 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" event={"ID":"08dfc393-0ddb-4bde-9b1f-2a48549f4549","Type":"ContainerDied","Data":"d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.460991 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e3b8a82bebc823be171b9425db7ef881f4918b904d172973b06ee5b15fe79a15"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.461001 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 
03:59:03.461009 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.461019 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.461026 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.461034 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.461042 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.461050 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.461057 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.461064 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.461074 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" event={"ID":"08dfc393-0ddb-4bde-9b1f-2a48549f4549","Type":"ContainerDied","Data":"473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.461085 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e3b8a82bebc823be171b9425db7ef881f4918b904d172973b06ee5b15fe79a15"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.461093 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.461101 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.461108 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.461115 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 
03:59:03.461123 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.461131 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.461138 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.461146 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.461153 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.461163 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpfzk" event={"ID":"08dfc393-0ddb-4bde-9b1f-2a48549f4549","Type":"ContainerDied","Data":"7dd5895cbe204199a83f73c83d8382055e46c7a26e8e16240fe09e92845509cf"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.461175 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e3b8a82bebc823be171b9425db7ef881f4918b904d172973b06ee5b15fe79a15"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.461184 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.461191 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.461198 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.461206 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.461214 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.461221 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.461229 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 
03:59:03.461236 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.461244 4685 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.462225 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" event={"ID":"c6448261-a026-47ee-89ca-dcb0374602b8","Type":"ContainerStarted","Data":"cce2790d1baabeb6aff49d682e4a74bb6a3d4232ec254fdc81cda70d26688ece"} Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.495327 4685 scope.go:117] "RemoveContainer" containerID="e3b8a82bebc823be171b9425db7ef881f4918b904d172973b06ee5b15fe79a15" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.497663 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-cpfzk"] Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.502099 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-cpfzk"] Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.510613 4685 scope.go:117] "RemoveContainer" containerID="f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.538137 4685 scope.go:117] "RemoveContainer" containerID="86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.599334 4685 scope.go:117] "RemoveContainer" containerID="34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.610160 4685 scope.go:117] "RemoveContainer" containerID="2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.623864 4685 scope.go:117] "RemoveContainer" containerID="beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.634579 4685 scope.go:117] "RemoveContainer" containerID="a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.654618 4685 scope.go:117] "RemoveContainer" containerID="d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.672056 4685 scope.go:117] "RemoveContainer" containerID="473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.682322 4685 scope.go:117] "RemoveContainer" containerID="246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.692577 4685 scope.go:117] "RemoveContainer" containerID="e3b8a82bebc823be171b9425db7ef881f4918b904d172973b06ee5b15fe79a15" Mar 21 03:59:03 crc kubenswrapper[4685]: E0321 03:59:03.692953 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3b8a82bebc823be171b9425db7ef881f4918b904d172973b06ee5b15fe79a15\": container with ID starting with e3b8a82bebc823be171b9425db7ef881f4918b904d172973b06ee5b15fe79a15 not found: ID does not exist" containerID="e3b8a82bebc823be171b9425db7ef881f4918b904d172973b06ee5b15fe79a15" Mar 21 
03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.692996 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3b8a82bebc823be171b9425db7ef881f4918b904d172973b06ee5b15fe79a15"} err="failed to get container status \"e3b8a82bebc823be171b9425db7ef881f4918b904d172973b06ee5b15fe79a15\": rpc error: code = NotFound desc = could not find container \"e3b8a82bebc823be171b9425db7ef881f4918b904d172973b06ee5b15fe79a15\": container with ID starting with e3b8a82bebc823be171b9425db7ef881f4918b904d172973b06ee5b15fe79a15 not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.693021 4685 scope.go:117] "RemoveContainer" containerID="f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a" Mar 21 03:59:03 crc kubenswrapper[4685]: E0321 03:59:03.693302 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a\": container with ID starting with f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a not found: ID does not exist" containerID="f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.693398 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a"} err="failed to get container status \"f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a\": rpc error: code = NotFound desc = could not find container \"f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a\": container with ID starting with f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.693474 4685 scope.go:117] "RemoveContainer" containerID="86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485" Mar 21 03:59:03 crc kubenswrapper[4685]: E0321 03:59:03.693749 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485\": container with ID starting with 86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485 not found: ID does not exist" containerID="86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.693769 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485"} err="failed to get container status \"86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485\": rpc error: code = NotFound desc = could not find container \"86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485\": container with ID starting with 86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485 not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.693782 4685 scope.go:117] "RemoveContainer" containerID="34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8" Mar 21 03:59:03 crc kubenswrapper[4685]: E0321 03:59:03.694189 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8\": container with ID starting with 
34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8 not found: ID does not exist" containerID="34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.694224 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8"} err="failed to get container status \"34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8\": rpc error: code = NotFound desc = could not find container \"34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8\": container with ID starting with 34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8 not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.694246 4685 scope.go:117] "RemoveContainer" containerID="2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a" Mar 21 03:59:03 crc kubenswrapper[4685]: E0321 03:59:03.694699 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a\": container with ID starting with 2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a not found: ID does not exist" containerID="2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.694795 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a"} err="failed to get container status \"2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a\": rpc error: code = NotFound desc = could not find container \"2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a\": container with ID starting with 2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.694901 4685 scope.go:117] "RemoveContainer" containerID="beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa" Mar 21 03:59:03 crc kubenswrapper[4685]: E0321 03:59:03.695221 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa\": container with ID starting with beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa not found: ID does not exist" containerID="beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.695246 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa"} err="failed to get container status \"beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa\": rpc error: code = NotFound desc = could not find container \"beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa\": container with ID starting with beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.695261 4685 scope.go:117] "RemoveContainer" containerID="a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f" Mar 21 03:59:03 crc kubenswrapper[4685]: E0321 03:59:03.695527 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f\": container with ID starting with a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f not found: ID does not exist" containerID="a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.695608 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f"} err="failed to get container status \"a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f\": rpc error: code = NotFound desc = could not find container \"a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f\": container with ID starting with a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.695667 4685 scope.go:117] "RemoveContainer" containerID="d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320" Mar 21 03:59:03 crc kubenswrapper[4685]: E0321 03:59:03.695924 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320\": container with ID starting with d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320 not found: ID does not exist" containerID="d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.696005 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320"} err="failed to get container status \"d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320\": rpc error: code = NotFound desc = could not find container \"d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320\": container with ID starting with d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320 not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.696063 4685 scope.go:117] "RemoveContainer" containerID="473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5" Mar 21 03:59:03 crc kubenswrapper[4685]: E0321 03:59:03.696360 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5\": container with ID starting with 473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5 not found: ID does not exist" containerID="473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.696387 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5"} err="failed to get container status \"473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5\": rpc error: code = NotFound desc = could not find container \"473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5\": container with ID starting with 473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5 not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.696404 4685 scope.go:117] "RemoveContainer" 
containerID="246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a" Mar 21 03:59:03 crc kubenswrapper[4685]: E0321 03:59:03.696868 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\": container with ID starting with 246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a not found: ID does not exist" containerID="246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.696894 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a"} err="failed to get container status \"246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\": rpc error: code = NotFound desc = could not find container \"246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\": container with ID starting with 246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.696910 4685 scope.go:117] "RemoveContainer" containerID="e3b8a82bebc823be171b9425db7ef881f4918b904d172973b06ee5b15fe79a15" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.697129 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3b8a82bebc823be171b9425db7ef881f4918b904d172973b06ee5b15fe79a15"} err="failed to get container status \"e3b8a82bebc823be171b9425db7ef881f4918b904d172973b06ee5b15fe79a15\": rpc error: code = NotFound desc = could not find container \"e3b8a82bebc823be171b9425db7ef881f4918b904d172973b06ee5b15fe79a15\": container with ID starting with e3b8a82bebc823be171b9425db7ef881f4918b904d172973b06ee5b15fe79a15 not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.697149 4685 scope.go:117] "RemoveContainer" containerID="f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.697353 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a"} err="failed to get container status \"f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a\": rpc error: code = NotFound desc = could not find container \"f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a\": container with ID starting with f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.697437 4685 scope.go:117] "RemoveContainer" containerID="86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.697735 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485"} err="failed to get container status \"86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485\": rpc error: code = NotFound desc = could not find container \"86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485\": container with ID starting with 86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485 not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.697908 4685 scope.go:117] "RemoveContainer" 
containerID="34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.698263 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8"} err="failed to get container status \"34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8\": rpc error: code = NotFound desc = could not find container \"34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8\": container with ID starting with 34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8 not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.698378 4685 scope.go:117] "RemoveContainer" containerID="2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.698635 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a"} err="failed to get container status \"2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a\": rpc error: code = NotFound desc = could not find container \"2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a\": container with ID starting with 2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.698657 4685 scope.go:117] "RemoveContainer" containerID="beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.698943 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa"} err="failed to get container status \"beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa\": rpc error: code = NotFound desc = could not find container \"beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa\": container with ID starting with beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.698963 4685 scope.go:117] "RemoveContainer" containerID="a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.699225 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f"} err="failed to get container status \"a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f\": rpc error: code = NotFound desc = could not find container \"a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f\": container with ID starting with a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.699298 4685 scope.go:117] "RemoveContainer" containerID="d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.699609 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320"} err="failed to get container status \"d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320\": rpc error: code = NotFound desc = could not find 
container \"d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320\": container with ID starting with d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320 not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.699726 4685 scope.go:117] "RemoveContainer" containerID="473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.700258 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5"} err="failed to get container status \"473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5\": rpc error: code = NotFound desc = could not find container \"473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5\": container with ID starting with 473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5 not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.700353 4685 scope.go:117] "RemoveContainer" containerID="246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.700607 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a"} err="failed to get container status \"246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\": rpc error: code = NotFound desc = could not find container \"246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\": container with ID starting with 246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.700688 4685 scope.go:117] "RemoveContainer" containerID="e3b8a82bebc823be171b9425db7ef881f4918b904d172973b06ee5b15fe79a15" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.700965 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3b8a82bebc823be171b9425db7ef881f4918b904d172973b06ee5b15fe79a15"} err="failed to get container status \"e3b8a82bebc823be171b9425db7ef881f4918b904d172973b06ee5b15fe79a15\": rpc error: code = NotFound desc = could not find container \"e3b8a82bebc823be171b9425db7ef881f4918b904d172973b06ee5b15fe79a15\": container with ID starting with e3b8a82bebc823be171b9425db7ef881f4918b904d172973b06ee5b15fe79a15 not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.701039 4685 scope.go:117] "RemoveContainer" containerID="f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.701313 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a"} err="failed to get container status \"f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a\": rpc error: code = NotFound desc = could not find container \"f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a\": container with ID starting with f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.701395 4685 scope.go:117] "RemoveContainer" containerID="86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.701664 4685 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485"} err="failed to get container status \"86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485\": rpc error: code = NotFound desc = could not find container \"86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485\": container with ID starting with 86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485 not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.701740 4685 scope.go:117] "RemoveContainer" containerID="34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.702015 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8"} err="failed to get container status \"34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8\": rpc error: code = NotFound desc = could not find container \"34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8\": container with ID starting with 34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8 not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.702093 4685 scope.go:117] "RemoveContainer" containerID="2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.703276 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a"} err="failed to get container status \"2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a\": rpc error: code = NotFound desc = could not find container \"2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a\": container with ID starting with 2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.703303 4685 scope.go:117] "RemoveContainer" containerID="beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.703956 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa"} err="failed to get container status \"beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa\": rpc error: code = NotFound desc = could not find container \"beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa\": container with ID starting with beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.703992 4685 scope.go:117] "RemoveContainer" containerID="a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.704238 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f"} err="failed to get container status \"a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f\": rpc error: code = NotFound desc = could not find container \"a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f\": container with ID starting with 
a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.704257 4685 scope.go:117] "RemoveContainer" containerID="d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.704484 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320"} err="failed to get container status \"d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320\": rpc error: code = NotFound desc = could not find container \"d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320\": container with ID starting with d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320 not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.704530 4685 scope.go:117] "RemoveContainer" containerID="473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.704717 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5"} err="failed to get container status \"473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5\": rpc error: code = NotFound desc = could not find container \"473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5\": container with ID starting with 473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5 not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.704731 4685 scope.go:117] "RemoveContainer" containerID="246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.704909 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a"} err="failed to get container status \"246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\": rpc error: code = NotFound desc = could not find container \"246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\": container with ID starting with 246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.704925 4685 scope.go:117] "RemoveContainer" containerID="e3b8a82bebc823be171b9425db7ef881f4918b904d172973b06ee5b15fe79a15" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.705093 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3b8a82bebc823be171b9425db7ef881f4918b904d172973b06ee5b15fe79a15"} err="failed to get container status \"e3b8a82bebc823be171b9425db7ef881f4918b904d172973b06ee5b15fe79a15\": rpc error: code = NotFound desc = could not find container \"e3b8a82bebc823be171b9425db7ef881f4918b904d172973b06ee5b15fe79a15\": container with ID starting with e3b8a82bebc823be171b9425db7ef881f4918b904d172973b06ee5b15fe79a15 not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.705115 4685 scope.go:117] "RemoveContainer" containerID="f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.705321 4685 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a"} err="failed to get container status \"f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a\": rpc error: code = NotFound desc = could not find container \"f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a\": container with ID starting with f4c5056ea79326435f55aafe0fc9580a47c394a0d04559cd50b5eb720b2f599a not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.705344 4685 scope.go:117] "RemoveContainer" containerID="86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.705542 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485"} err="failed to get container status \"86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485\": rpc error: code = NotFound desc = could not find container \"86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485\": container with ID starting with 86328f182f7db62c6133733a4a4b9171f479c5b134c11250bb1839234544f485 not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.705562 4685 scope.go:117] "RemoveContainer" containerID="34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.705753 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8"} err="failed to get container status \"34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8\": rpc error: code = NotFound desc = could not find container \"34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8\": container with ID starting with 34752fc66f7ecfa5c0a9a761bc6a69d21ab1fa332ecd79948cb6b1ea34a8a8b8 not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.705773 4685 scope.go:117] "RemoveContainer" containerID="2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.705978 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a"} err="failed to get container status \"2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a\": rpc error: code = NotFound desc = could not find container \"2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a\": container with ID starting with 2215b819a38926d6f34fede86a32762a942f00bc552ca9c0b8df5f17ed47c10a not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.705997 4685 scope.go:117] "RemoveContainer" containerID="beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.706188 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa"} err="failed to get container status \"beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa\": rpc error: code = NotFound desc = could not find container \"beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa\": container with ID starting with beb9850d14f3c89154f1960550eed4073643c6b70832bb3958176ac924f5a9fa not found: ID does not exist" Mar 
21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.706201 4685 scope.go:117] "RemoveContainer" containerID="a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.706412 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f"} err="failed to get container status \"a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f\": rpc error: code = NotFound desc = could not find container \"a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f\": container with ID starting with a4ad07f9e8bc73e15364d59e6baf04c54550518537bac8cbfcf180b423a1da4f not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.706429 4685 scope.go:117] "RemoveContainer" containerID="d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.706614 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320"} err="failed to get container status \"d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320\": rpc error: code = NotFound desc = could not find container \"d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320\": container with ID starting with d89981c8d56dfb581289dc794c8402f5fde866fd6948c2c0a1940eb3ba99a320 not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.706628 4685 scope.go:117] "RemoveContainer" containerID="473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.706812 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5"} err="failed to get container status \"473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5\": rpc error: code = NotFound desc = could not find container \"473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5\": container with ID starting with 473ea0e8600e10ea2a695b213d69795c56c1a399cbc5dc82ddb34f7261a034e5 not found: ID does not exist" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.706827 4685 scope.go:117] "RemoveContainer" containerID="246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a" Mar 21 03:59:03 crc kubenswrapper[4685]: I0321 03:59:03.707028 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a"} err="failed to get container status \"246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\": rpc error: code = NotFound desc = could not find container \"246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a\": container with ID starting with 246def988ffeaec6d82733cd17caa15912872a162f4f7db5ac6e53a7127e0a4a not found: ID does not exist" Mar 21 03:59:04 crc kubenswrapper[4685]: I0321 03:59:04.314299 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08dfc393-0ddb-4bde-9b1f-2a48549f4549" path="/var/lib/kubelet/pods/08dfc393-0ddb-4bde-9b1f-2a48549f4549/volumes" Mar 21 03:59:04 crc kubenswrapper[4685]: I0321 03:59:04.472044 4685 generic.go:334] "Generic (PLEG): container finished" podID="c6448261-a026-47ee-89ca-dcb0374602b8" 
containerID="3e78cc92c5acc6c8ddd18d6912de7fe4c61b207ab450cdd885fe1fbdc608e74c" exitCode=0 Mar 21 03:59:04 crc kubenswrapper[4685]: I0321 03:59:04.472161 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" event={"ID":"c6448261-a026-47ee-89ca-dcb0374602b8","Type":"ContainerDied","Data":"3e78cc92c5acc6c8ddd18d6912de7fe4c61b207ab450cdd885fe1fbdc608e74c"} Mar 21 03:59:04 crc kubenswrapper[4685]: I0321 03:59:04.479364 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7jcm2_cd9b1743-6b69-46d3-a429-6f83bf43317a/kube-multus/2.log" Mar 21 03:59:05 crc kubenswrapper[4685]: I0321 03:59:05.490197 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" event={"ID":"c6448261-a026-47ee-89ca-dcb0374602b8","Type":"ContainerStarted","Data":"fd19621d40efd37adf965cfbc6270cab841b19f32460e1018110b30cf42b6cbc"} Mar 21 03:59:05 crc kubenswrapper[4685]: I0321 03:59:05.490530 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" event={"ID":"c6448261-a026-47ee-89ca-dcb0374602b8","Type":"ContainerStarted","Data":"3a9453c468a016147c8bef634295b0a307871c3ac6b1f45690a4ce310e76573c"} Mar 21 03:59:05 crc kubenswrapper[4685]: I0321 03:59:05.490544 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" event={"ID":"c6448261-a026-47ee-89ca-dcb0374602b8","Type":"ContainerStarted","Data":"48a6b8479cffcb39bb2a2f12cf34d770aff293a6ec193b06eb0aaec956c09e5b"} Mar 21 03:59:05 crc kubenswrapper[4685]: I0321 03:59:05.490558 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" event={"ID":"c6448261-a026-47ee-89ca-dcb0374602b8","Type":"ContainerStarted","Data":"ab0ef4382a842e27aea3a925d717464fd8da64dcc54a623a9b108875730cc4cb"} Mar 21 03:59:05 crc kubenswrapper[4685]: I0321 03:59:05.490568 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" event={"ID":"c6448261-a026-47ee-89ca-dcb0374602b8","Type":"ContainerStarted","Data":"0ffe824d505ff7a0e7d036efbb811efcc0674f0a586023cb8dc38e9de8dbe83a"} Mar 21 03:59:05 crc kubenswrapper[4685]: I0321 03:59:05.490578 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" event={"ID":"c6448261-a026-47ee-89ca-dcb0374602b8","Type":"ContainerStarted","Data":"3d8d53d87d12e207f8bd99042653a6a64707a24636a869a1eac9c727f8d8bb3f"} Mar 21 03:59:08 crc kubenswrapper[4685]: I0321 03:59:08.524132 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" event={"ID":"c6448261-a026-47ee-89ca-dcb0374602b8","Type":"ContainerStarted","Data":"da77ff565ebdb58fdbdf08f6b5e50b238320d2620b71b176a3fe80d71acb38c1"} Mar 21 03:59:09 crc kubenswrapper[4685]: I0321 03:59:09.685489 4685 patch_prober.go:28] interesting pod/machine-config-daemon-7r9cg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 03:59:09 crc kubenswrapper[4685]: I0321 03:59:09.686033 4685 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 03:59:10 crc kubenswrapper[4685]: I0321 03:59:10.538549 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" event={"ID":"c6448261-a026-47ee-89ca-dcb0374602b8","Type":"ContainerStarted","Data":"aab460b7cfe9694a63a8a683f4188d42f236498ccea7fe801fcf99cb12d71749"} Mar 21 03:59:10 crc kubenswrapper[4685]: I0321 03:59:10.539060 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:10 crc kubenswrapper[4685]: I0321 03:59:10.539076 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:10 crc kubenswrapper[4685]: I0321 03:59:10.539087 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:10 crc kubenswrapper[4685]: I0321 03:59:10.563974 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:10 crc kubenswrapper[4685]: I0321 03:59:10.567386 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" podStartSLOduration=7.56736953 podStartE2EDuration="7.56736953s" podCreationTimestamp="2026-03-21 03:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 03:59:10.561333444 +0000 UTC m=+783.038402236" watchObservedRunningTime="2026-03-21 03:59:10.56736953 +0000 UTC m=+783.044438322" Mar 21 03:59:10 crc kubenswrapper[4685]: I0321 03:59:10.572209 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:15 crc kubenswrapper[4685]: I0321 03:59:15.300391 4685 scope.go:117] "RemoveContainer" containerID="aeb2e6d1910f6dc402503b823272d26ca9f0ffb3b41c3137e50a4345d1710170" Mar 21 03:59:15 crc kubenswrapper[4685]: E0321 03:59:15.301048 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-7jcm2_openshift-multus(cd9b1743-6b69-46d3-a429-6f83bf43317a)\"" pod="openshift-multus/multus-7jcm2" podUID="cd9b1743-6b69-46d3-a429-6f83bf43317a" Mar 21 03:59:21 crc kubenswrapper[4685]: I0321 03:59:21.264424 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs"] Mar 21 03:59:21 crc kubenswrapper[4685]: I0321 03:59:21.265986 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs" Mar 21 03:59:21 crc kubenswrapper[4685]: I0321 03:59:21.268930 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 21 03:59:21 crc kubenswrapper[4685]: I0321 03:59:21.273718 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs"] Mar 21 03:59:21 crc kubenswrapper[4685]: I0321 03:59:21.436871 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8cbeffba-f99c-488d-b0db-1cf3b8e31823-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs\" (UID: \"8cbeffba-f99c-488d-b0db-1cf3b8e31823\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs" Mar 21 03:59:21 crc kubenswrapper[4685]: I0321 03:59:21.437216 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl8nh\" (UniqueName: \"kubernetes.io/projected/8cbeffba-f99c-488d-b0db-1cf3b8e31823-kube-api-access-pl8nh\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs\" (UID: \"8cbeffba-f99c-488d-b0db-1cf3b8e31823\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs" Mar 21 03:59:21 crc kubenswrapper[4685]: I0321 03:59:21.437387 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8cbeffba-f99c-488d-b0db-1cf3b8e31823-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs\" (UID: \"8cbeffba-f99c-488d-b0db-1cf3b8e31823\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs" Mar 21 03:59:21 crc kubenswrapper[4685]: I0321 03:59:21.538307 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8cbeffba-f99c-488d-b0db-1cf3b8e31823-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs\" (UID: \"8cbeffba-f99c-488d-b0db-1cf3b8e31823\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs" Mar 21 03:59:21 crc kubenswrapper[4685]: I0321 03:59:21.538638 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8cbeffba-f99c-488d-b0db-1cf3b8e31823-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs\" (UID: \"8cbeffba-f99c-488d-b0db-1cf3b8e31823\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs" Mar 21 03:59:21 crc kubenswrapper[4685]: I0321 03:59:21.538886 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl8nh\" (UniqueName: \"kubernetes.io/projected/8cbeffba-f99c-488d-b0db-1cf3b8e31823-kube-api-access-pl8nh\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs\" (UID: \"8cbeffba-f99c-488d-b0db-1cf3b8e31823\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs" Mar 21 03:59:21 crc kubenswrapper[4685]: I0321 03:59:21.539414 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/8cbeffba-f99c-488d-b0db-1cf3b8e31823-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs\" (UID: \"8cbeffba-f99c-488d-b0db-1cf3b8e31823\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs" Mar 21 03:59:21 crc kubenswrapper[4685]: I0321 03:59:21.539454 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8cbeffba-f99c-488d-b0db-1cf3b8e31823-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs\" (UID: \"8cbeffba-f99c-488d-b0db-1cf3b8e31823\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs" Mar 21 03:59:21 crc kubenswrapper[4685]: I0321 03:59:21.574158 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl8nh\" (UniqueName: \"kubernetes.io/projected/8cbeffba-f99c-488d-b0db-1cf3b8e31823-kube-api-access-pl8nh\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs\" (UID: \"8cbeffba-f99c-488d-b0db-1cf3b8e31823\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs" Mar 21 03:59:21 crc kubenswrapper[4685]: I0321 03:59:21.579970 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs" Mar 21 03:59:21 crc kubenswrapper[4685]: E0321 03:59:21.610596 4685 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs_openshift-marketplace_8cbeffba-f99c-488d-b0db-1cf3b8e31823_0(fe5ee4b565d372f9c09fd79248d98c5be37024d39235349bbf17272d44ae33da): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 03:59:21 crc kubenswrapper[4685]: E0321 03:59:21.610677 4685 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs_openshift-marketplace_8cbeffba-f99c-488d-b0db-1cf3b8e31823_0(fe5ee4b565d372f9c09fd79248d98c5be37024d39235349bbf17272d44ae33da): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs" Mar 21 03:59:21 crc kubenswrapper[4685]: E0321 03:59:21.610709 4685 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs_openshift-marketplace_8cbeffba-f99c-488d-b0db-1cf3b8e31823_0(fe5ee4b565d372f9c09fd79248d98c5be37024d39235349bbf17272d44ae33da): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs" Mar 21 03:59:21 crc kubenswrapper[4685]: E0321 03:59:21.610758 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs_openshift-marketplace(8cbeffba-f99c-488d-b0db-1cf3b8e31823)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs_openshift-marketplace(8cbeffba-f99c-488d-b0db-1cf3b8e31823)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs_openshift-marketplace_8cbeffba-f99c-488d-b0db-1cf3b8e31823_0(fe5ee4b565d372f9c09fd79248d98c5be37024d39235349bbf17272d44ae33da): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs" podUID="8cbeffba-f99c-488d-b0db-1cf3b8e31823" Mar 21 03:59:22 crc kubenswrapper[4685]: I0321 03:59:22.619088 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs" Mar 21 03:59:22 crc kubenswrapper[4685]: I0321 03:59:22.620105 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs" Mar 21 03:59:22 crc kubenswrapper[4685]: E0321 03:59:22.655731 4685 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs_openshift-marketplace_8cbeffba-f99c-488d-b0db-1cf3b8e31823_0(38b76dd84ac074baf912ec57049f8e37d7a9d3ea143b22e374b13199765e0a32): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 03:59:22 crc kubenswrapper[4685]: E0321 03:59:22.655826 4685 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs_openshift-marketplace_8cbeffba-f99c-488d-b0db-1cf3b8e31823_0(38b76dd84ac074baf912ec57049f8e37d7a9d3ea143b22e374b13199765e0a32): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs" Mar 21 03:59:22 crc kubenswrapper[4685]: E0321 03:59:22.655894 4685 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs_openshift-marketplace_8cbeffba-f99c-488d-b0db-1cf3b8e31823_0(38b76dd84ac074baf912ec57049f8e37d7a9d3ea143b22e374b13199765e0a32): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs" Mar 21 03:59:22 crc kubenswrapper[4685]: E0321 03:59:22.655977 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs_openshift-marketplace(8cbeffba-f99c-488d-b0db-1cf3b8e31823)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs_openshift-marketplace(8cbeffba-f99c-488d-b0db-1cf3b8e31823)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs_openshift-marketplace_8cbeffba-f99c-488d-b0db-1cf3b8e31823_0(38b76dd84ac074baf912ec57049f8e37d7a9d3ea143b22e374b13199765e0a32): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs" podUID="8cbeffba-f99c-488d-b0db-1cf3b8e31823" Mar 21 03:59:30 crc kubenswrapper[4685]: I0321 03:59:30.300767 4685 scope.go:117] "RemoveContainer" containerID="aeb2e6d1910f6dc402503b823272d26ca9f0ffb3b41c3137e50a4345d1710170" Mar 21 03:59:30 crc kubenswrapper[4685]: I0321 03:59:30.663996 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7jcm2_cd9b1743-6b69-46d3-a429-6f83bf43317a/kube-multus/2.log" Mar 21 03:59:30 crc kubenswrapper[4685]: I0321 03:59:30.664198 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7jcm2" event={"ID":"cd9b1743-6b69-46d3-a429-6f83bf43317a","Type":"ContainerStarted","Data":"810969a0cc83c49ba24922ddee94fe60d9ad3ea647fa16848ecbd3c5258e1df0"} Mar 21 03:59:33 crc kubenswrapper[4685]: I0321 03:59:33.396990 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pb8hx" Mar 21 03:59:36 crc kubenswrapper[4685]: I0321 03:59:36.300069 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs" Mar 21 03:59:36 crc kubenswrapper[4685]: I0321 03:59:36.300636 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs" Mar 21 03:59:36 crc kubenswrapper[4685]: I0321 03:59:36.507096 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs"] Mar 21 03:59:36 crc kubenswrapper[4685]: I0321 03:59:36.702432 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs" event={"ID":"8cbeffba-f99c-488d-b0db-1cf3b8e31823","Type":"ContainerStarted","Data":"6a8666108f667113874fb76528ff5890bc6376dfacc0fbce51c3186c68a05700"} Mar 21 03:59:36 crc kubenswrapper[4685]: I0321 03:59:36.702476 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs" event={"ID":"8cbeffba-f99c-488d-b0db-1cf3b8e31823","Type":"ContainerStarted","Data":"45c2796d24be366aea6baddbe6b145ed879dfabf9537e2bf399b1237c8d31bd0"} Mar 21 03:59:37 crc kubenswrapper[4685]: I0321 03:59:37.719674 4685 generic.go:334] "Generic (PLEG): container finished" podID="8cbeffba-f99c-488d-b0db-1cf3b8e31823" containerID="6a8666108f667113874fb76528ff5890bc6376dfacc0fbce51c3186c68a05700" exitCode=0 Mar 21 03:59:37 crc kubenswrapper[4685]: I0321 03:59:37.719754 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs" event={"ID":"8cbeffba-f99c-488d-b0db-1cf3b8e31823","Type":"ContainerDied","Data":"6a8666108f667113874fb76528ff5890bc6376dfacc0fbce51c3186c68a05700"} Mar 21 03:59:39 crc kubenswrapper[4685]: I0321 03:59:39.685404 4685 patch_prober.go:28] interesting pod/machine-config-daemon-7r9cg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 03:59:39 crc kubenswrapper[4685]: I0321 03:59:39.685473 4685 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 03:59:39 crc kubenswrapper[4685]: I0321 03:59:39.737193 4685 generic.go:334] "Generic (PLEG): container finished" podID="8cbeffba-f99c-488d-b0db-1cf3b8e31823" containerID="ed10e1f5e32b22c6adfb8aa682d92ddf450638f77b1d31fb011e06a6d76bdadd" exitCode=0 Mar 21 03:59:39 crc kubenswrapper[4685]: I0321 03:59:39.737241 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs" event={"ID":"8cbeffba-f99c-488d-b0db-1cf3b8e31823","Type":"ContainerDied","Data":"ed10e1f5e32b22c6adfb8aa682d92ddf450638f77b1d31fb011e06a6d76bdadd"} Mar 21 03:59:40 crc kubenswrapper[4685]: I0321 03:59:40.745276 4685 generic.go:334] "Generic (PLEG): container finished" podID="8cbeffba-f99c-488d-b0db-1cf3b8e31823" containerID="23942b5a236b157c75ea97f3ab55ff2d6a929faba98951a9faae645711b908c4" exitCode=0 Mar 21 03:59:40 crc kubenswrapper[4685]: I0321 03:59:40.745326 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs" 
event={"ID":"8cbeffba-f99c-488d-b0db-1cf3b8e31823","Type":"ContainerDied","Data":"23942b5a236b157c75ea97f3ab55ff2d6a929faba98951a9faae645711b908c4"} Mar 21 03:59:41 crc kubenswrapper[4685]: I0321 03:59:41.995594 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs" Mar 21 03:59:42 crc kubenswrapper[4685]: I0321 03:59:42.191493 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pl8nh\" (UniqueName: \"kubernetes.io/projected/8cbeffba-f99c-488d-b0db-1cf3b8e31823-kube-api-access-pl8nh\") pod \"8cbeffba-f99c-488d-b0db-1cf3b8e31823\" (UID: \"8cbeffba-f99c-488d-b0db-1cf3b8e31823\") " Mar 21 03:59:42 crc kubenswrapper[4685]: I0321 03:59:42.191596 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8cbeffba-f99c-488d-b0db-1cf3b8e31823-util\") pod \"8cbeffba-f99c-488d-b0db-1cf3b8e31823\" (UID: \"8cbeffba-f99c-488d-b0db-1cf3b8e31823\") " Mar 21 03:59:42 crc kubenswrapper[4685]: I0321 03:59:42.191628 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8cbeffba-f99c-488d-b0db-1cf3b8e31823-bundle\") pod \"8cbeffba-f99c-488d-b0db-1cf3b8e31823\" (UID: \"8cbeffba-f99c-488d-b0db-1cf3b8e31823\") " Mar 21 03:59:42 crc kubenswrapper[4685]: I0321 03:59:42.192761 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cbeffba-f99c-488d-b0db-1cf3b8e31823-bundle" (OuterVolumeSpecName: "bundle") pod "8cbeffba-f99c-488d-b0db-1cf3b8e31823" (UID: "8cbeffba-f99c-488d-b0db-1cf3b8e31823"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 03:59:42 crc kubenswrapper[4685]: I0321 03:59:42.197042 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cbeffba-f99c-488d-b0db-1cf3b8e31823-kube-api-access-pl8nh" (OuterVolumeSpecName: "kube-api-access-pl8nh") pod "8cbeffba-f99c-488d-b0db-1cf3b8e31823" (UID: "8cbeffba-f99c-488d-b0db-1cf3b8e31823"). InnerVolumeSpecName "kube-api-access-pl8nh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 03:59:42 crc kubenswrapper[4685]: I0321 03:59:42.293072 4685 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8cbeffba-f99c-488d-b0db-1cf3b8e31823-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 03:59:42 crc kubenswrapper[4685]: I0321 03:59:42.293109 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pl8nh\" (UniqueName: \"kubernetes.io/projected/8cbeffba-f99c-488d-b0db-1cf3b8e31823-kube-api-access-pl8nh\") on node \"crc\" DevicePath \"\"" Mar 21 03:59:42 crc kubenswrapper[4685]: I0321 03:59:42.375047 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cbeffba-f99c-488d-b0db-1cf3b8e31823-util" (OuterVolumeSpecName: "util") pod "8cbeffba-f99c-488d-b0db-1cf3b8e31823" (UID: "8cbeffba-f99c-488d-b0db-1cf3b8e31823"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 03:59:42 crc kubenswrapper[4685]: I0321 03:59:42.394256 4685 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8cbeffba-f99c-488d-b0db-1cf3b8e31823-util\") on node \"crc\" DevicePath \"\"" Mar 21 03:59:42 crc kubenswrapper[4685]: I0321 03:59:42.757193 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs" event={"ID":"8cbeffba-f99c-488d-b0db-1cf3b8e31823","Type":"ContainerDied","Data":"45c2796d24be366aea6baddbe6b145ed879dfabf9537e2bf399b1237c8d31bd0"} Mar 21 03:59:42 crc kubenswrapper[4685]: I0321 03:59:42.757228 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45c2796d24be366aea6baddbe6b145ed879dfabf9537e2bf399b1237c8d31bd0" Mar 21 03:59:42 crc kubenswrapper[4685]: I0321 03:59:42.757295 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs" Mar 21 03:59:54 crc kubenswrapper[4685]: I0321 03:59:54.497647 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-9b7d5d78b-jx8nv"] Mar 21 03:59:54 crc kubenswrapper[4685]: E0321 03:59:54.499174 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cbeffba-f99c-488d-b0db-1cf3b8e31823" containerName="util" Mar 21 03:59:54 crc kubenswrapper[4685]: I0321 03:59:54.499241 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cbeffba-f99c-488d-b0db-1cf3b8e31823" containerName="util" Mar 21 03:59:54 crc kubenswrapper[4685]: E0321 03:59:54.499295 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cbeffba-f99c-488d-b0db-1cf3b8e31823" containerName="extract" Mar 21 03:59:54 crc kubenswrapper[4685]: I0321 03:59:54.499348 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cbeffba-f99c-488d-b0db-1cf3b8e31823" containerName="extract" Mar 21 03:59:54 crc kubenswrapper[4685]: E0321 03:59:54.499397 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cbeffba-f99c-488d-b0db-1cf3b8e31823" containerName="pull" Mar 21 03:59:54 crc kubenswrapper[4685]: I0321 03:59:54.499446 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cbeffba-f99c-488d-b0db-1cf3b8e31823" containerName="pull" Mar 21 03:59:54 crc kubenswrapper[4685]: I0321 03:59:54.499619 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cbeffba-f99c-488d-b0db-1cf3b8e31823" containerName="extract" Mar 21 03:59:54 crc kubenswrapper[4685]: I0321 03:59:54.500045 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-9b7d5d78b-jx8nv" Mar 21 03:59:54 crc kubenswrapper[4685]: I0321 03:59:54.502714 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 21 03:59:54 crc kubenswrapper[4685]: I0321 03:59:54.503022 4685 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-5gp2c" Mar 21 03:59:54 crc kubenswrapper[4685]: I0321 03:59:54.505249 4685 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 21 03:59:54 crc kubenswrapper[4685]: I0321 03:59:54.506079 4685 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 21 03:59:54 crc kubenswrapper[4685]: I0321 03:59:54.506671 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 21 03:59:54 crc kubenswrapper[4685]: I0321 03:59:54.516425 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-9b7d5d78b-jx8nv"] Mar 21 03:59:54 crc kubenswrapper[4685]: I0321 03:59:54.545751 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ed0eadeb-865c-4742-b429-5f8e0bd67f2b-webhook-cert\") pod \"metallb-operator-controller-manager-9b7d5d78b-jx8nv\" (UID: \"ed0eadeb-865c-4742-b429-5f8e0bd67f2b\") " pod="metallb-system/metallb-operator-controller-manager-9b7d5d78b-jx8nv" Mar 21 03:59:54 crc kubenswrapper[4685]: I0321 03:59:54.545907 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc95k\" (UniqueName: \"kubernetes.io/projected/ed0eadeb-865c-4742-b429-5f8e0bd67f2b-kube-api-access-cc95k\") pod \"metallb-operator-controller-manager-9b7d5d78b-jx8nv\" (UID: \"ed0eadeb-865c-4742-b429-5f8e0bd67f2b\") " pod="metallb-system/metallb-operator-controller-manager-9b7d5d78b-jx8nv" Mar 21 03:59:54 crc kubenswrapper[4685]: I0321 03:59:54.545965 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ed0eadeb-865c-4742-b429-5f8e0bd67f2b-apiservice-cert\") pod \"metallb-operator-controller-manager-9b7d5d78b-jx8nv\" (UID: \"ed0eadeb-865c-4742-b429-5f8e0bd67f2b\") " pod="metallb-system/metallb-operator-controller-manager-9b7d5d78b-jx8nv" Mar 21 03:59:54 crc kubenswrapper[4685]: I0321 03:59:54.646617 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc95k\" (UniqueName: \"kubernetes.io/projected/ed0eadeb-865c-4742-b429-5f8e0bd67f2b-kube-api-access-cc95k\") pod \"metallb-operator-controller-manager-9b7d5d78b-jx8nv\" (UID: \"ed0eadeb-865c-4742-b429-5f8e0bd67f2b\") " pod="metallb-system/metallb-operator-controller-manager-9b7d5d78b-jx8nv" Mar 21 03:59:54 crc kubenswrapper[4685]: I0321 03:59:54.646751 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ed0eadeb-865c-4742-b429-5f8e0bd67f2b-apiservice-cert\") pod \"metallb-operator-controller-manager-9b7d5d78b-jx8nv\" (UID: \"ed0eadeb-865c-4742-b429-5f8e0bd67f2b\") " pod="metallb-system/metallb-operator-controller-manager-9b7d5d78b-jx8nv" Mar 21 03:59:54 crc kubenswrapper[4685]: I0321 03:59:54.646795 4685 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ed0eadeb-865c-4742-b429-5f8e0bd67f2b-webhook-cert\") pod \"metallb-operator-controller-manager-9b7d5d78b-jx8nv\" (UID: \"ed0eadeb-865c-4742-b429-5f8e0bd67f2b\") " pod="metallb-system/metallb-operator-controller-manager-9b7d5d78b-jx8nv" Mar 21 03:59:54 crc kubenswrapper[4685]: I0321 03:59:54.652767 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ed0eadeb-865c-4742-b429-5f8e0bd67f2b-webhook-cert\") pod \"metallb-operator-controller-manager-9b7d5d78b-jx8nv\" (UID: \"ed0eadeb-865c-4742-b429-5f8e0bd67f2b\") " pod="metallb-system/metallb-operator-controller-manager-9b7d5d78b-jx8nv" Mar 21 03:59:54 crc kubenswrapper[4685]: I0321 03:59:54.653620 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ed0eadeb-865c-4742-b429-5f8e0bd67f2b-apiservice-cert\") pod \"metallb-operator-controller-manager-9b7d5d78b-jx8nv\" (UID: \"ed0eadeb-865c-4742-b429-5f8e0bd67f2b\") " pod="metallb-system/metallb-operator-controller-manager-9b7d5d78b-jx8nv" Mar 21 03:59:54 crc kubenswrapper[4685]: I0321 03:59:54.664198 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc95k\" (UniqueName: \"kubernetes.io/projected/ed0eadeb-865c-4742-b429-5f8e0bd67f2b-kube-api-access-cc95k\") pod \"metallb-operator-controller-manager-9b7d5d78b-jx8nv\" (UID: \"ed0eadeb-865c-4742-b429-5f8e0bd67f2b\") " pod="metallb-system/metallb-operator-controller-manager-9b7d5d78b-jx8nv" Mar 21 03:59:54 crc kubenswrapper[4685]: I0321 03:59:54.815657 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-9b7d5d78b-jx8nv" Mar 21 03:59:54 crc kubenswrapper[4685]: I0321 03:59:54.865291 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-654575f8df-qj9tz"] Mar 21 03:59:54 crc kubenswrapper[4685]: I0321 03:59:54.866222 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-654575f8df-qj9tz" Mar 21 03:59:54 crc kubenswrapper[4685]: I0321 03:59:54.870670 4685 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 21 03:59:54 crc kubenswrapper[4685]: I0321 03:59:54.870922 4685 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-z8zxh" Mar 21 03:59:54 crc kubenswrapper[4685]: I0321 03:59:54.871234 4685 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 21 03:59:54 crc kubenswrapper[4685]: I0321 03:59:54.884905 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-654575f8df-qj9tz"] Mar 21 03:59:54 crc kubenswrapper[4685]: I0321 03:59:54.949755 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f2976724-903c-43f4-b917-da8a483a2e9e-webhook-cert\") pod \"metallb-operator-webhook-server-654575f8df-qj9tz\" (UID: \"f2976724-903c-43f4-b917-da8a483a2e9e\") " pod="metallb-system/metallb-operator-webhook-server-654575f8df-qj9tz" Mar 21 03:59:54 crc kubenswrapper[4685]: I0321 03:59:54.949894 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltgg4\" (UniqueName: \"kubernetes.io/projected/f2976724-903c-43f4-b917-da8a483a2e9e-kube-api-access-ltgg4\") pod \"metallb-operator-webhook-server-654575f8df-qj9tz\" (UID: \"f2976724-903c-43f4-b917-da8a483a2e9e\") " pod="metallb-system/metallb-operator-webhook-server-654575f8df-qj9tz" Mar 21 03:59:54 crc kubenswrapper[4685]: I0321 03:59:54.950056 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f2976724-903c-43f4-b917-da8a483a2e9e-apiservice-cert\") pod \"metallb-operator-webhook-server-654575f8df-qj9tz\" (UID: \"f2976724-903c-43f4-b917-da8a483a2e9e\") " pod="metallb-system/metallb-operator-webhook-server-654575f8df-qj9tz" Mar 21 03:59:55 crc kubenswrapper[4685]: I0321 03:59:55.050773 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f2976724-903c-43f4-b917-da8a483a2e9e-apiservice-cert\") pod \"metallb-operator-webhook-server-654575f8df-qj9tz\" (UID: \"f2976724-903c-43f4-b917-da8a483a2e9e\") " pod="metallb-system/metallb-operator-webhook-server-654575f8df-qj9tz" Mar 21 03:59:55 crc kubenswrapper[4685]: I0321 03:59:55.050949 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f2976724-903c-43f4-b917-da8a483a2e9e-webhook-cert\") pod \"metallb-operator-webhook-server-654575f8df-qj9tz\" (UID: \"f2976724-903c-43f4-b917-da8a483a2e9e\") " pod="metallb-system/metallb-operator-webhook-server-654575f8df-qj9tz" Mar 21 03:59:55 crc kubenswrapper[4685]: I0321 03:59:55.051014 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltgg4\" (UniqueName: \"kubernetes.io/projected/f2976724-903c-43f4-b917-da8a483a2e9e-kube-api-access-ltgg4\") pod \"metallb-operator-webhook-server-654575f8df-qj9tz\" (UID: \"f2976724-903c-43f4-b917-da8a483a2e9e\") " pod="metallb-system/metallb-operator-webhook-server-654575f8df-qj9tz" Mar 21 03:59:55 crc kubenswrapper[4685]: I0321 
03:59:55.053815 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-9b7d5d78b-jx8nv"] Mar 21 03:59:55 crc kubenswrapper[4685]: I0321 03:59:55.055804 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f2976724-903c-43f4-b917-da8a483a2e9e-webhook-cert\") pod \"metallb-operator-webhook-server-654575f8df-qj9tz\" (UID: \"f2976724-903c-43f4-b917-da8a483a2e9e\") " pod="metallb-system/metallb-operator-webhook-server-654575f8df-qj9tz" Mar 21 03:59:55 crc kubenswrapper[4685]: I0321 03:59:55.065222 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f2976724-903c-43f4-b917-da8a483a2e9e-apiservice-cert\") pod \"metallb-operator-webhook-server-654575f8df-qj9tz\" (UID: \"f2976724-903c-43f4-b917-da8a483a2e9e\") " pod="metallb-system/metallb-operator-webhook-server-654575f8df-qj9tz" Mar 21 03:59:55 crc kubenswrapper[4685]: I0321 03:59:55.076116 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltgg4\" (UniqueName: \"kubernetes.io/projected/f2976724-903c-43f4-b917-da8a483a2e9e-kube-api-access-ltgg4\") pod \"metallb-operator-webhook-server-654575f8df-qj9tz\" (UID: \"f2976724-903c-43f4-b917-da8a483a2e9e\") " pod="metallb-system/metallb-operator-webhook-server-654575f8df-qj9tz" Mar 21 03:59:55 crc kubenswrapper[4685]: I0321 03:59:55.187139 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-654575f8df-qj9tz" Mar 21 03:59:55 crc kubenswrapper[4685]: I0321 03:59:55.641601 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-654575f8df-qj9tz"] Mar 21 03:59:55 crc kubenswrapper[4685]: W0321 03:59:55.650703 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2976724_903c_43f4_b917_da8a483a2e9e.slice/crio-05f2c360a85de56a248e4cc6f48061f90c8c5348623f540fbe7f24727130d777 WatchSource:0}: Error finding container 05f2c360a85de56a248e4cc6f48061f90c8c5348623f540fbe7f24727130d777: Status 404 returned error can't find the container with id 05f2c360a85de56a248e4cc6f48061f90c8c5348623f540fbe7f24727130d777 Mar 21 03:59:55 crc kubenswrapper[4685]: I0321 03:59:55.822750 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-654575f8df-qj9tz" event={"ID":"f2976724-903c-43f4-b917-da8a483a2e9e","Type":"ContainerStarted","Data":"05f2c360a85de56a248e4cc6f48061f90c8c5348623f540fbe7f24727130d777"} Mar 21 03:59:55 crc kubenswrapper[4685]: I0321 03:59:55.823927 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-9b7d5d78b-jx8nv" event={"ID":"ed0eadeb-865c-4742-b429-5f8e0bd67f2b","Type":"ContainerStarted","Data":"4f6d6d481c6c434af8f2343676e73bae05766b60ec0a6d36381428b9e5143882"} Mar 21 03:59:58 crc kubenswrapper[4685]: I0321 03:59:58.841699 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-9b7d5d78b-jx8nv" event={"ID":"ed0eadeb-865c-4742-b429-5f8e0bd67f2b","Type":"ContainerStarted","Data":"32cdf703c8038e36560e0ae51cbbd363c5ea5b61b2a948f56c63b6dd01c2d6f3"} Mar 21 03:59:58 crc kubenswrapper[4685]: I0321 03:59:58.842166 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-controller-manager-9b7d5d78b-jx8nv" Mar 21 03:59:58 crc kubenswrapper[4685]: I0321 03:59:58.864551 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-9b7d5d78b-jx8nv" podStartSLOduration=1.63090861 podStartE2EDuration="4.864532427s" podCreationTimestamp="2026-03-21 03:59:54 +0000 UTC" firstStartedPulling="2026-03-21 03:59:55.08010148 +0000 UTC m=+827.557170272" lastFinishedPulling="2026-03-21 03:59:58.313725297 +0000 UTC m=+830.790794089" observedRunningTime="2026-03-21 03:59:58.861352499 +0000 UTC m=+831.338421291" watchObservedRunningTime="2026-03-21 03:59:58.864532427 +0000 UTC m=+831.341601209" Mar 21 04:00:00 crc kubenswrapper[4685]: I0321 04:00:00.130308 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567760-tf4jq"] Mar 21 04:00:00 crc kubenswrapper[4685]: I0321 04:00:00.131670 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567760-tf4jq" Mar 21 04:00:00 crc kubenswrapper[4685]: I0321 04:00:00.133496 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 21 04:00:00 crc kubenswrapper[4685]: I0321 04:00:00.133719 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 21 04:00:00 crc kubenswrapper[4685]: I0321 04:00:00.134426 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567760-xpknn"] Mar 21 04:00:00 crc kubenswrapper[4685]: I0321 04:00:00.135092 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567760-xpknn" Mar 21 04:00:00 crc kubenswrapper[4685]: I0321 04:00:00.139130 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 04:00:00 crc kubenswrapper[4685]: I0321 04:00:00.139296 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k75cc" Mar 21 04:00:00 crc kubenswrapper[4685]: I0321 04:00:00.139518 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 04:00:00 crc kubenswrapper[4685]: I0321 04:00:00.142724 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567760-xpknn"] Mar 21 04:00:00 crc kubenswrapper[4685]: I0321 04:00:00.148766 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567760-tf4jq"] Mar 21 04:00:00 crc kubenswrapper[4685]: I0321 04:00:00.328926 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8jh7\" (UniqueName: \"kubernetes.io/projected/5f3ae43e-162a-41fc-85f0-92106386bea7-kube-api-access-m8jh7\") pod \"auto-csr-approver-29567760-xpknn\" (UID: \"5f3ae43e-162a-41fc-85f0-92106386bea7\") " pod="openshift-infra/auto-csr-approver-29567760-xpknn" Mar 21 04:00:00 crc kubenswrapper[4685]: I0321 04:00:00.329323 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6e4d18a8-c7d2-4c54-a32e-ee3743437bb0-config-volume\") pod \"collect-profiles-29567760-tf4jq\" (UID: \"6e4d18a8-c7d2-4c54-a32e-ee3743437bb0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567760-tf4jq" Mar 21 04:00:00 crc kubenswrapper[4685]: I0321 04:00:00.329412 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6e4d18a8-c7d2-4c54-a32e-ee3743437bb0-secret-volume\") pod \"collect-profiles-29567760-tf4jq\" (UID: \"6e4d18a8-c7d2-4c54-a32e-ee3743437bb0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567760-tf4jq" Mar 21 04:00:00 crc kubenswrapper[4685]: I0321 04:00:00.329647 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd6dk\" (UniqueName: \"kubernetes.io/projected/6e4d18a8-c7d2-4c54-a32e-ee3743437bb0-kube-api-access-vd6dk\") pod \"collect-profiles-29567760-tf4jq\" (UID: \"6e4d18a8-c7d2-4c54-a32e-ee3743437bb0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567760-tf4jq" Mar 21 04:00:00 crc kubenswrapper[4685]: I0321 04:00:00.430896 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6e4d18a8-c7d2-4c54-a32e-ee3743437bb0-secret-volume\") pod \"collect-profiles-29567760-tf4jq\" (UID: \"6e4d18a8-c7d2-4c54-a32e-ee3743437bb0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567760-tf4jq" Mar 21 04:00:00 crc kubenswrapper[4685]: I0321 04:00:00.431194 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd6dk\" (UniqueName: \"kubernetes.io/projected/6e4d18a8-c7d2-4c54-a32e-ee3743437bb0-kube-api-access-vd6dk\") pod \"collect-profiles-29567760-tf4jq\" (UID: \"6e4d18a8-c7d2-4c54-a32e-ee3743437bb0\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29567760-tf4jq" Mar 21 04:00:00 crc kubenswrapper[4685]: I0321 04:00:00.431335 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8jh7\" (UniqueName: \"kubernetes.io/projected/5f3ae43e-162a-41fc-85f0-92106386bea7-kube-api-access-m8jh7\") pod \"auto-csr-approver-29567760-xpknn\" (UID: \"5f3ae43e-162a-41fc-85f0-92106386bea7\") " pod="openshift-infra/auto-csr-approver-29567760-xpknn" Mar 21 04:00:00 crc kubenswrapper[4685]: I0321 04:00:00.431755 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6e4d18a8-c7d2-4c54-a32e-ee3743437bb0-config-volume\") pod \"collect-profiles-29567760-tf4jq\" (UID: \"6e4d18a8-c7d2-4c54-a32e-ee3743437bb0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567760-tf4jq" Mar 21 04:00:00 crc kubenswrapper[4685]: I0321 04:00:00.432490 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6e4d18a8-c7d2-4c54-a32e-ee3743437bb0-config-volume\") pod \"collect-profiles-29567760-tf4jq\" (UID: \"6e4d18a8-c7d2-4c54-a32e-ee3743437bb0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567760-tf4jq" Mar 21 04:00:00 crc kubenswrapper[4685]: I0321 04:00:00.444213 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6e4d18a8-c7d2-4c54-a32e-ee3743437bb0-secret-volume\") pod \"collect-profiles-29567760-tf4jq\" (UID: \"6e4d18a8-c7d2-4c54-a32e-ee3743437bb0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567760-tf4jq" Mar 21 04:00:00 crc kubenswrapper[4685]: I0321 04:00:00.446018 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8jh7\" (UniqueName: \"kubernetes.io/projected/5f3ae43e-162a-41fc-85f0-92106386bea7-kube-api-access-m8jh7\") pod \"auto-csr-approver-29567760-xpknn\" (UID: \"5f3ae43e-162a-41fc-85f0-92106386bea7\") " pod="openshift-infra/auto-csr-approver-29567760-xpknn" Mar 21 04:00:00 crc kubenswrapper[4685]: I0321 04:00:00.446938 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd6dk\" (UniqueName: \"kubernetes.io/projected/6e4d18a8-c7d2-4c54-a32e-ee3743437bb0-kube-api-access-vd6dk\") pod \"collect-profiles-29567760-tf4jq\" (UID: \"6e4d18a8-c7d2-4c54-a32e-ee3743437bb0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567760-tf4jq" Mar 21 04:00:00 crc kubenswrapper[4685]: I0321 04:00:00.452718 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567760-tf4jq" Mar 21 04:00:00 crc kubenswrapper[4685]: I0321 04:00:00.459280 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567760-xpknn" Mar 21 04:00:00 crc kubenswrapper[4685]: I0321 04:00:00.629137 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567760-tf4jq"] Mar 21 04:00:00 crc kubenswrapper[4685]: W0321 04:00:00.640685 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e4d18a8_c7d2_4c54_a32e_ee3743437bb0.slice/crio-6ebb4c4103cbe9a078d25f02c9ce0d0e2a2f0db3aaa84886c97a99a568aaaf83 WatchSource:0}: Error finding container 6ebb4c4103cbe9a078d25f02c9ce0d0e2a2f0db3aaa84886c97a99a568aaaf83: Status 404 returned error can't find the container with id 6ebb4c4103cbe9a078d25f02c9ce0d0e2a2f0db3aaa84886c97a99a568aaaf83 Mar 21 04:00:00 crc kubenswrapper[4685]: I0321 04:00:00.689224 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567760-xpknn"] Mar 21 04:00:00 crc kubenswrapper[4685]: W0321 04:00:00.710411 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f3ae43e_162a_41fc_85f0_92106386bea7.slice/crio-d120b7bfae0a11e9a8d679d06a1a51dd3b35f04ddcba8f89631e68a1bf524155 WatchSource:0}: Error finding container d120b7bfae0a11e9a8d679d06a1a51dd3b35f04ddcba8f89631e68a1bf524155: Status 404 returned error can't find the container with id d120b7bfae0a11e9a8d679d06a1a51dd3b35f04ddcba8f89631e68a1bf524155 Mar 21 04:00:00 crc kubenswrapper[4685]: I0321 04:00:00.855489 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567760-xpknn" event={"ID":"5f3ae43e-162a-41fc-85f0-92106386bea7","Type":"ContainerStarted","Data":"d120b7bfae0a11e9a8d679d06a1a51dd3b35f04ddcba8f89631e68a1bf524155"} Mar 21 04:00:00 crc kubenswrapper[4685]: I0321 04:00:00.857004 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-654575f8df-qj9tz" event={"ID":"f2976724-903c-43f4-b917-da8a483a2e9e","Type":"ContainerStarted","Data":"d02a0e917f315a1ebef755c6f92b85d829660f67bcf8c6e78f04285d85a17b92"} Mar 21 04:00:00 crc kubenswrapper[4685]: I0321 04:00:00.857097 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-654575f8df-qj9tz" Mar 21 04:00:00 crc kubenswrapper[4685]: I0321 04:00:00.857887 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567760-tf4jq" event={"ID":"6e4d18a8-c7d2-4c54-a32e-ee3743437bb0","Type":"ContainerStarted","Data":"6ebb4c4103cbe9a078d25f02c9ce0d0e2a2f0db3aaa84886c97a99a568aaaf83"} Mar 21 04:00:00 crc kubenswrapper[4685]: I0321 04:00:00.873539 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-654575f8df-qj9tz" podStartSLOduration=2.494807707 podStartE2EDuration="6.873522589s" podCreationTimestamp="2026-03-21 03:59:54 +0000 UTC" firstStartedPulling="2026-03-21 03:59:55.654398957 +0000 UTC m=+828.131467749" lastFinishedPulling="2026-03-21 04:00:00.033113839 +0000 UTC m=+832.510182631" observedRunningTime="2026-03-21 04:00:00.872184162 +0000 UTC m=+833.349252954" watchObservedRunningTime="2026-03-21 04:00:00.873522589 +0000 UTC m=+833.350591381" Mar 21 04:00:01 crc kubenswrapper[4685]: I0321 04:00:01.865480 4685 generic.go:334] "Generic (PLEG): container finished" podID="6e4d18a8-c7d2-4c54-a32e-ee3743437bb0" 
containerID="621249fa53ef4eebe5bc96cfce0f9cb10a727056dadd772b85eff935c72d8ef7" exitCode=0 Mar 21 04:00:01 crc kubenswrapper[4685]: I0321 04:00:01.865580 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567760-tf4jq" event={"ID":"6e4d18a8-c7d2-4c54-a32e-ee3743437bb0","Type":"ContainerDied","Data":"621249fa53ef4eebe5bc96cfce0f9cb10a727056dadd772b85eff935c72d8ef7"} Mar 21 04:00:02 crc kubenswrapper[4685]: I0321 04:00:02.101050 4685 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 21 04:00:03 crc kubenswrapper[4685]: I0321 04:00:03.113211 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567760-tf4jq" Mar 21 04:00:03 crc kubenswrapper[4685]: I0321 04:00:03.265638 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6e4d18a8-c7d2-4c54-a32e-ee3743437bb0-config-volume\") pod \"6e4d18a8-c7d2-4c54-a32e-ee3743437bb0\" (UID: \"6e4d18a8-c7d2-4c54-a32e-ee3743437bb0\") " Mar 21 04:00:03 crc kubenswrapper[4685]: I0321 04:00:03.266028 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd6dk\" (UniqueName: \"kubernetes.io/projected/6e4d18a8-c7d2-4c54-a32e-ee3743437bb0-kube-api-access-vd6dk\") pod \"6e4d18a8-c7d2-4c54-a32e-ee3743437bb0\" (UID: \"6e4d18a8-c7d2-4c54-a32e-ee3743437bb0\") " Mar 21 04:00:03 crc kubenswrapper[4685]: I0321 04:00:03.266055 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6e4d18a8-c7d2-4c54-a32e-ee3743437bb0-secret-volume\") pod \"6e4d18a8-c7d2-4c54-a32e-ee3743437bb0\" (UID: \"6e4d18a8-c7d2-4c54-a32e-ee3743437bb0\") " Mar 21 04:00:03 crc kubenswrapper[4685]: I0321 04:00:03.266495 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e4d18a8-c7d2-4c54-a32e-ee3743437bb0-config-volume" (OuterVolumeSpecName: "config-volume") pod "6e4d18a8-c7d2-4c54-a32e-ee3743437bb0" (UID: "6e4d18a8-c7d2-4c54-a32e-ee3743437bb0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:00:03 crc kubenswrapper[4685]: I0321 04:00:03.271731 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e4d18a8-c7d2-4c54-a32e-ee3743437bb0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6e4d18a8-c7d2-4c54-a32e-ee3743437bb0" (UID: "6e4d18a8-c7d2-4c54-a32e-ee3743437bb0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:00:03 crc kubenswrapper[4685]: I0321 04:00:03.281024 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e4d18a8-c7d2-4c54-a32e-ee3743437bb0-kube-api-access-vd6dk" (OuterVolumeSpecName: "kube-api-access-vd6dk") pod "6e4d18a8-c7d2-4c54-a32e-ee3743437bb0" (UID: "6e4d18a8-c7d2-4c54-a32e-ee3743437bb0"). InnerVolumeSpecName "kube-api-access-vd6dk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:00:03 crc kubenswrapper[4685]: I0321 04:00:03.367021 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vd6dk\" (UniqueName: \"kubernetes.io/projected/6e4d18a8-c7d2-4c54-a32e-ee3743437bb0-kube-api-access-vd6dk\") on node \"crc\" DevicePath \"\"" Mar 21 04:00:03 crc kubenswrapper[4685]: I0321 04:00:03.367058 4685 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6e4d18a8-c7d2-4c54-a32e-ee3743437bb0-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 21 04:00:03 crc kubenswrapper[4685]: I0321 04:00:03.367069 4685 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6e4d18a8-c7d2-4c54-a32e-ee3743437bb0-config-volume\") on node \"crc\" DevicePath \"\"" Mar 21 04:00:03 crc kubenswrapper[4685]: I0321 04:00:03.877767 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567760-xpknn" event={"ID":"5f3ae43e-162a-41fc-85f0-92106386bea7","Type":"ContainerStarted","Data":"df54b908af7408c6db30b39d212ea30f6e7d3f734d3dbd8e347006a39eef7666"} Mar 21 04:00:03 crc kubenswrapper[4685]: I0321 04:00:03.879157 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567760-tf4jq" event={"ID":"6e4d18a8-c7d2-4c54-a32e-ee3743437bb0","Type":"ContainerDied","Data":"6ebb4c4103cbe9a078d25f02c9ce0d0e2a2f0db3aaa84886c97a99a568aaaf83"} Mar 21 04:00:03 crc kubenswrapper[4685]: I0321 04:00:03.879209 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ebb4c4103cbe9a078d25f02c9ce0d0e2a2f0db3aaa84886c97a99a568aaaf83" Mar 21 04:00:03 crc kubenswrapper[4685]: I0321 04:00:03.879256 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567760-tf4jq" Mar 21 04:00:03 crc kubenswrapper[4685]: I0321 04:00:03.906472 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567760-xpknn" podStartSLOduration=1.149519817 podStartE2EDuration="3.906453734s" podCreationTimestamp="2026-03-21 04:00:00 +0000 UTC" firstStartedPulling="2026-03-21 04:00:00.714229446 +0000 UTC m=+833.191298238" lastFinishedPulling="2026-03-21 04:00:03.471163363 +0000 UTC m=+835.948232155" observedRunningTime="2026-03-21 04:00:03.90596289 +0000 UTC m=+836.383031692" watchObservedRunningTime="2026-03-21 04:00:03.906453734 +0000 UTC m=+836.383522526" Mar 21 04:00:04 crc kubenswrapper[4685]: I0321 04:00:04.886877 4685 generic.go:334] "Generic (PLEG): container finished" podID="5f3ae43e-162a-41fc-85f0-92106386bea7" containerID="df54b908af7408c6db30b39d212ea30f6e7d3f734d3dbd8e347006a39eef7666" exitCode=0 Mar 21 04:00:04 crc kubenswrapper[4685]: I0321 04:00:04.886924 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567760-xpknn" event={"ID":"5f3ae43e-162a-41fc-85f0-92106386bea7","Type":"ContainerDied","Data":"df54b908af7408c6db30b39d212ea30f6e7d3f734d3dbd8e347006a39eef7666"} Mar 21 04:00:06 crc kubenswrapper[4685]: I0321 04:00:06.107886 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567760-xpknn" Mar 21 04:00:06 crc kubenswrapper[4685]: I0321 04:00:06.300354 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8jh7\" (UniqueName: \"kubernetes.io/projected/5f3ae43e-162a-41fc-85f0-92106386bea7-kube-api-access-m8jh7\") pod \"5f3ae43e-162a-41fc-85f0-92106386bea7\" (UID: \"5f3ae43e-162a-41fc-85f0-92106386bea7\") " Mar 21 04:00:06 crc kubenswrapper[4685]: I0321 04:00:06.305438 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f3ae43e-162a-41fc-85f0-92106386bea7-kube-api-access-m8jh7" (OuterVolumeSpecName: "kube-api-access-m8jh7") pod "5f3ae43e-162a-41fc-85f0-92106386bea7" (UID: "5f3ae43e-162a-41fc-85f0-92106386bea7"). InnerVolumeSpecName "kube-api-access-m8jh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:00:06 crc kubenswrapper[4685]: I0321 04:00:06.401790 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8jh7\" (UniqueName: \"kubernetes.io/projected/5f3ae43e-162a-41fc-85f0-92106386bea7-kube-api-access-m8jh7\") on node \"crc\" DevicePath \"\"" Mar 21 04:00:06 crc kubenswrapper[4685]: I0321 04:00:06.899125 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567760-xpknn" event={"ID":"5f3ae43e-162a-41fc-85f0-92106386bea7","Type":"ContainerDied","Data":"d120b7bfae0a11e9a8d679d06a1a51dd3b35f04ddcba8f89631e68a1bf524155"} Mar 21 04:00:06 crc kubenswrapper[4685]: I0321 04:00:06.899166 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d120b7bfae0a11e9a8d679d06a1a51dd3b35f04ddcba8f89631e68a1bf524155" Mar 21 04:00:06 crc kubenswrapper[4685]: I0321 04:00:06.899187 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567760-xpknn" Mar 21 04:00:07 crc kubenswrapper[4685]: I0321 04:00:07.162882 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567754-5w6w5"] Mar 21 04:00:07 crc kubenswrapper[4685]: I0321 04:00:07.165923 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567754-5w6w5"] Mar 21 04:00:08 crc kubenswrapper[4685]: I0321 04:00:08.313190 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1ef6e0e-74c2-4e2b-bcfa-70d821d09201" path="/var/lib/kubelet/pods/e1ef6e0e-74c2-4e2b-bcfa-70d821d09201/volumes" Mar 21 04:00:09 crc kubenswrapper[4685]: I0321 04:00:09.685134 4685 patch_prober.go:28] interesting pod/machine-config-daemon-7r9cg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:00:09 crc kubenswrapper[4685]: I0321 04:00:09.685451 4685 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:00:09 crc kubenswrapper[4685]: I0321 04:00:09.685513 4685 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" Mar 21 04:00:09 crc kubenswrapper[4685]: I0321 04:00:09.686170 4685 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"da8be442c3ea2f96e685bee081e96f02736707ffa414186cd8dedbc178b8c1c5"} pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 04:00:09 crc kubenswrapper[4685]: I0321 04:00:09.686240 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerName="machine-config-daemon" containerID="cri-o://da8be442c3ea2f96e685bee081e96f02736707ffa414186cd8dedbc178b8c1c5" gracePeriod=600 Mar 21 04:00:09 crc kubenswrapper[4685]: I0321 04:00:09.914164 4685 generic.go:334] "Generic (PLEG): container finished" podID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerID="da8be442c3ea2f96e685bee081e96f02736707ffa414186cd8dedbc178b8c1c5" exitCode=0 Mar 21 04:00:09 crc kubenswrapper[4685]: I0321 04:00:09.914217 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" event={"ID":"cea46fe2-4e41-43ab-a069-cb30fb4e732c","Type":"ContainerDied","Data":"da8be442c3ea2f96e685bee081e96f02736707ffa414186cd8dedbc178b8c1c5"} Mar 21 04:00:09 crc kubenswrapper[4685]: I0321 04:00:09.914267 4685 scope.go:117] "RemoveContainer" containerID="51700df58050c3bd486b7492e271833d0dee5610ed2bdc61e612672321528c6c" Mar 21 04:00:10 crc kubenswrapper[4685]: I0321 04:00:10.920659 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" 
event={"ID":"cea46fe2-4e41-43ab-a069-cb30fb4e732c","Type":"ContainerStarted","Data":"ac4ffd676ad57605265aed5caa44cae8130cfde3468685b94b3265e3fc4a39a0"} Mar 21 04:00:15 crc kubenswrapper[4685]: I0321 04:00:15.192651 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-654575f8df-qj9tz" Mar 21 04:00:27 crc kubenswrapper[4685]: I0321 04:00:27.966862 4685 scope.go:117] "RemoveContainer" containerID="414be56549cd6b11efa98ee004718f676f58a724a219aea231fbba058e444aaa" Mar 21 04:00:34 crc kubenswrapper[4685]: I0321 04:00:34.817951 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-9b7d5d78b-jx8nv" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.579326 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-tsjrs"] Mar 21 04:00:35 crc kubenswrapper[4685]: E0321 04:00:35.580046 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f3ae43e-162a-41fc-85f0-92106386bea7" containerName="oc" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.580079 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f3ae43e-162a-41fc-85f0-92106386bea7" containerName="oc" Mar 21 04:00:35 crc kubenswrapper[4685]: E0321 04:00:35.580127 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e4d18a8-c7d2-4c54-a32e-ee3743437bb0" containerName="collect-profiles" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.580140 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e4d18a8-c7d2-4c54-a32e-ee3743437bb0" containerName="collect-profiles" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.580305 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e4d18a8-c7d2-4c54-a32e-ee3743437bb0" containerName="collect-profiles" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.580342 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f3ae43e-162a-41fc-85f0-92106386bea7" containerName="oc" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.583377 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-tsjrs" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.584971 4685 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.586027 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.586315 4685 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-5x5r7" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.595738 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-rjxbv"] Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.611471 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rjxbv" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.614744 4685 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.621531 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-rjxbv"] Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.671717 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-cd4d4"] Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.672603 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-cd4d4" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.675051 4685 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.675678 4685 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-wqz6l" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.675893 4685 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.676066 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.684040 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-w4n8h"] Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.685162 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-w4n8h" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.690629 4685 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.699899 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-w4n8h"] Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.762138 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/7e226197-b1ce-49a2-a3b9-5aed3d774a12-frr-conf\") pod \"frr-k8s-tsjrs\" (UID: \"7e226197-b1ce-49a2-a3b9-5aed3d774a12\") " pod="metallb-system/frr-k8s-tsjrs" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.762179 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdbzv\" (UniqueName: \"kubernetes.io/projected/545a6f92-59ae-4ffb-824d-e493044c0082-kube-api-access-bdbzv\") pod \"frr-k8s-webhook-server-bcc4b6f68-rjxbv\" (UID: \"545a6f92-59ae-4ffb-824d-e493044c0082\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rjxbv" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.762200 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/7e226197-b1ce-49a2-a3b9-5aed3d774a12-frr-startup\") pod \"frr-k8s-tsjrs\" (UID: \"7e226197-b1ce-49a2-a3b9-5aed3d774a12\") " pod="metallb-system/frr-k8s-tsjrs" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.762249 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: 
\"kubernetes.io/empty-dir/7e226197-b1ce-49a2-a3b9-5aed3d774a12-frr-sockets\") pod \"frr-k8s-tsjrs\" (UID: \"7e226197-b1ce-49a2-a3b9-5aed3d774a12\") " pod="metallb-system/frr-k8s-tsjrs" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.762286 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76cca999-b151-46c4-b61b-b6249d75e2f5-cert\") pod \"controller-7bb4cc7c98-w4n8h\" (UID: \"76cca999-b151-46c4-b61b-b6249d75e2f5\") " pod="metallb-system/controller-7bb4cc7c98-w4n8h" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.762344 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e226197-b1ce-49a2-a3b9-5aed3d774a12-metrics-certs\") pod \"frr-k8s-tsjrs\" (UID: \"7e226197-b1ce-49a2-a3b9-5aed3d774a12\") " pod="metallb-system/frr-k8s-tsjrs" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.762388 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76cca999-b151-46c4-b61b-b6249d75e2f5-metrics-certs\") pod \"controller-7bb4cc7c98-w4n8h\" (UID: \"76cca999-b151-46c4-b61b-b6249d75e2f5\") " pod="metallb-system/controller-7bb4cc7c98-w4n8h" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.762407 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/dc9a0be8-5b8c-43d8-a670-06541535d7a0-memberlist\") pod \"speaker-cd4d4\" (UID: \"dc9a0be8-5b8c-43d8-a670-06541535d7a0\") " pod="metallb-system/speaker-cd4d4" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.762424 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkqfm\" (UniqueName: \"kubernetes.io/projected/76cca999-b151-46c4-b61b-b6249d75e2f5-kube-api-access-wkqfm\") pod \"controller-7bb4cc7c98-w4n8h\" (UID: \"76cca999-b151-46c4-b61b-b6249d75e2f5\") " pod="metallb-system/controller-7bb4cc7c98-w4n8h" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.762486 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc9a0be8-5b8c-43d8-a670-06541535d7a0-metrics-certs\") pod \"speaker-cd4d4\" (UID: \"dc9a0be8-5b8c-43d8-a670-06541535d7a0\") " pod="metallb-system/speaker-cd4d4" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.762510 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drm2w\" (UniqueName: \"kubernetes.io/projected/7e226197-b1ce-49a2-a3b9-5aed3d774a12-kube-api-access-drm2w\") pod \"frr-k8s-tsjrs\" (UID: \"7e226197-b1ce-49a2-a3b9-5aed3d774a12\") " pod="metallb-system/frr-k8s-tsjrs" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.762528 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/dc9a0be8-5b8c-43d8-a670-06541535d7a0-metallb-excludel2\") pod \"speaker-cd4d4\" (UID: \"dc9a0be8-5b8c-43d8-a670-06541535d7a0\") " pod="metallb-system/speaker-cd4d4" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.762563 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/545a6f92-59ae-4ffb-824d-e493044c0082-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-rjxbv\" (UID: \"545a6f92-59ae-4ffb-824d-e493044c0082\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rjxbv" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.762590 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhlbf\" (UniqueName: \"kubernetes.io/projected/dc9a0be8-5b8c-43d8-a670-06541535d7a0-kube-api-access-lhlbf\") pod \"speaker-cd4d4\" (UID: \"dc9a0be8-5b8c-43d8-a670-06541535d7a0\") " pod="metallb-system/speaker-cd4d4" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.762606 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/7e226197-b1ce-49a2-a3b9-5aed3d774a12-reloader\") pod \"frr-k8s-tsjrs\" (UID: \"7e226197-b1ce-49a2-a3b9-5aed3d774a12\") " pod="metallb-system/frr-k8s-tsjrs" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.762621 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/7e226197-b1ce-49a2-a3b9-5aed3d774a12-metrics\") pod \"frr-k8s-tsjrs\" (UID: \"7e226197-b1ce-49a2-a3b9-5aed3d774a12\") " pod="metallb-system/frr-k8s-tsjrs" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.863423 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhlbf\" (UniqueName: \"kubernetes.io/projected/dc9a0be8-5b8c-43d8-a670-06541535d7a0-kube-api-access-lhlbf\") pod \"speaker-cd4d4\" (UID: \"dc9a0be8-5b8c-43d8-a670-06541535d7a0\") " pod="metallb-system/speaker-cd4d4" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.863478 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/7e226197-b1ce-49a2-a3b9-5aed3d774a12-metrics\") pod \"frr-k8s-tsjrs\" (UID: \"7e226197-b1ce-49a2-a3b9-5aed3d774a12\") " pod="metallb-system/frr-k8s-tsjrs" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.863499 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/7e226197-b1ce-49a2-a3b9-5aed3d774a12-reloader\") pod \"frr-k8s-tsjrs\" (UID: \"7e226197-b1ce-49a2-a3b9-5aed3d774a12\") " pod="metallb-system/frr-k8s-tsjrs" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.864076 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/7e226197-b1ce-49a2-a3b9-5aed3d774a12-metrics\") pod \"frr-k8s-tsjrs\" (UID: \"7e226197-b1ce-49a2-a3b9-5aed3d774a12\") " pod="metallb-system/frr-k8s-tsjrs" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.864117 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/7e226197-b1ce-49a2-a3b9-5aed3d774a12-reloader\") pod \"frr-k8s-tsjrs\" (UID: \"7e226197-b1ce-49a2-a3b9-5aed3d774a12\") " pod="metallb-system/frr-k8s-tsjrs" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.864253 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/7e226197-b1ce-49a2-a3b9-5aed3d774a12-frr-conf\") pod \"frr-k8s-tsjrs\" (UID: \"7e226197-b1ce-49a2-a3b9-5aed3d774a12\") " pod="metallb-system/frr-k8s-tsjrs" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.864306 4685 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdbzv\" (UniqueName: \"kubernetes.io/projected/545a6f92-59ae-4ffb-824d-e493044c0082-kube-api-access-bdbzv\") pod \"frr-k8s-webhook-server-bcc4b6f68-rjxbv\" (UID: \"545a6f92-59ae-4ffb-824d-e493044c0082\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rjxbv" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.864361 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/7e226197-b1ce-49a2-a3b9-5aed3d774a12-frr-startup\") pod \"frr-k8s-tsjrs\" (UID: \"7e226197-b1ce-49a2-a3b9-5aed3d774a12\") " pod="metallb-system/frr-k8s-tsjrs" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.864441 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/7e226197-b1ce-49a2-a3b9-5aed3d774a12-frr-sockets\") pod \"frr-k8s-tsjrs\" (UID: \"7e226197-b1ce-49a2-a3b9-5aed3d774a12\") " pod="metallb-system/frr-k8s-tsjrs" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.864490 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76cca999-b151-46c4-b61b-b6249d75e2f5-cert\") pod \"controller-7bb4cc7c98-w4n8h\" (UID: \"76cca999-b151-46c4-b61b-b6249d75e2f5\") " pod="metallb-system/controller-7bb4cc7c98-w4n8h" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.864545 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e226197-b1ce-49a2-a3b9-5aed3d774a12-metrics-certs\") pod \"frr-k8s-tsjrs\" (UID: \"7e226197-b1ce-49a2-a3b9-5aed3d774a12\") " pod="metallb-system/frr-k8s-tsjrs" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.864557 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/7e226197-b1ce-49a2-a3b9-5aed3d774a12-frr-conf\") pod \"frr-k8s-tsjrs\" (UID: \"7e226197-b1ce-49a2-a3b9-5aed3d774a12\") " pod="metallb-system/frr-k8s-tsjrs" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.864623 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76cca999-b151-46c4-b61b-b6249d75e2f5-metrics-certs\") pod \"controller-7bb4cc7c98-w4n8h\" (UID: \"76cca999-b151-46c4-b61b-b6249d75e2f5\") " pod="metallb-system/controller-7bb4cc7c98-w4n8h" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.864663 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/dc9a0be8-5b8c-43d8-a670-06541535d7a0-memberlist\") pod \"speaker-cd4d4\" (UID: \"dc9a0be8-5b8c-43d8-a670-06541535d7a0\") " pod="metallb-system/speaker-cd4d4" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.864700 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkqfm\" (UniqueName: \"kubernetes.io/projected/76cca999-b151-46c4-b61b-b6249d75e2f5-kube-api-access-wkqfm\") pod \"controller-7bb4cc7c98-w4n8h\" (UID: \"76cca999-b151-46c4-b61b-b6249d75e2f5\") " pod="metallb-system/controller-7bb4cc7c98-w4n8h" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.864730 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc9a0be8-5b8c-43d8-a670-06541535d7a0-metrics-certs\") pod 
\"speaker-cd4d4\" (UID: \"dc9a0be8-5b8c-43d8-a670-06541535d7a0\") " pod="metallb-system/speaker-cd4d4" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.864775 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drm2w\" (UniqueName: \"kubernetes.io/projected/7e226197-b1ce-49a2-a3b9-5aed3d774a12-kube-api-access-drm2w\") pod \"frr-k8s-tsjrs\" (UID: \"7e226197-b1ce-49a2-a3b9-5aed3d774a12\") " pod="metallb-system/frr-k8s-tsjrs" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.864823 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/dc9a0be8-5b8c-43d8-a670-06541535d7a0-metallb-excludel2\") pod \"speaker-cd4d4\" (UID: \"dc9a0be8-5b8c-43d8-a670-06541535d7a0\") " pod="metallb-system/speaker-cd4d4" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.864886 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/545a6f92-59ae-4ffb-824d-e493044c0082-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-rjxbv\" (UID: \"545a6f92-59ae-4ffb-824d-e493044c0082\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rjxbv" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.865475 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/7e226197-b1ce-49a2-a3b9-5aed3d774a12-frr-startup\") pod \"frr-k8s-tsjrs\" (UID: \"7e226197-b1ce-49a2-a3b9-5aed3d774a12\") " pod="metallb-system/frr-k8s-tsjrs" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.865876 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/7e226197-b1ce-49a2-a3b9-5aed3d774a12-frr-sockets\") pod \"frr-k8s-tsjrs\" (UID: \"7e226197-b1ce-49a2-a3b9-5aed3d774a12\") " pod="metallb-system/frr-k8s-tsjrs" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.866260 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/dc9a0be8-5b8c-43d8-a670-06541535d7a0-metallb-excludel2\") pod \"speaker-cd4d4\" (UID: \"dc9a0be8-5b8c-43d8-a670-06541535d7a0\") " pod="metallb-system/speaker-cd4d4" Mar 21 04:00:35 crc kubenswrapper[4685]: E0321 04:00:35.866711 4685 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 21 04:00:35 crc kubenswrapper[4685]: E0321 04:00:35.866770 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc9a0be8-5b8c-43d8-a670-06541535d7a0-memberlist podName:dc9a0be8-5b8c-43d8-a670-06541535d7a0 nodeName:}" failed. No retries permitted until 2026-03-21 04:00:36.36675326 +0000 UTC m=+868.843822072 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/dc9a0be8-5b8c-43d8-a670-06541535d7a0-memberlist") pod "speaker-cd4d4" (UID: "dc9a0be8-5b8c-43d8-a670-06541535d7a0") : secret "metallb-memberlist" not found Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.868523 4685 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.871095 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e226197-b1ce-49a2-a3b9-5aed3d774a12-metrics-certs\") pod \"frr-k8s-tsjrs\" (UID: \"7e226197-b1ce-49a2-a3b9-5aed3d774a12\") " pod="metallb-system/frr-k8s-tsjrs" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.871366 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76cca999-b151-46c4-b61b-b6249d75e2f5-metrics-certs\") pod \"controller-7bb4cc7c98-w4n8h\" (UID: \"76cca999-b151-46c4-b61b-b6249d75e2f5\") " pod="metallb-system/controller-7bb4cc7c98-w4n8h" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.871698 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc9a0be8-5b8c-43d8-a670-06541535d7a0-metrics-certs\") pod \"speaker-cd4d4\" (UID: \"dc9a0be8-5b8c-43d8-a670-06541535d7a0\") " pod="metallb-system/speaker-cd4d4" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.871734 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/545a6f92-59ae-4ffb-824d-e493044c0082-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-rjxbv\" (UID: \"545a6f92-59ae-4ffb-824d-e493044c0082\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rjxbv" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.890307 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76cca999-b151-46c4-b61b-b6249d75e2f5-cert\") pod \"controller-7bb4cc7c98-w4n8h\" (UID: \"76cca999-b151-46c4-b61b-b6249d75e2f5\") " pod="metallb-system/controller-7bb4cc7c98-w4n8h" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.890632 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhlbf\" (UniqueName: \"kubernetes.io/projected/dc9a0be8-5b8c-43d8-a670-06541535d7a0-kube-api-access-lhlbf\") pod \"speaker-cd4d4\" (UID: \"dc9a0be8-5b8c-43d8-a670-06541535d7a0\") " pod="metallb-system/speaker-cd4d4" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.891275 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkqfm\" (UniqueName: \"kubernetes.io/projected/76cca999-b151-46c4-b61b-b6249d75e2f5-kube-api-access-wkqfm\") pod \"controller-7bb4cc7c98-w4n8h\" (UID: \"76cca999-b151-46c4-b61b-b6249d75e2f5\") " pod="metallb-system/controller-7bb4cc7c98-w4n8h" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.891485 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdbzv\" (UniqueName: \"kubernetes.io/projected/545a6f92-59ae-4ffb-824d-e493044c0082-kube-api-access-bdbzv\") pod \"frr-k8s-webhook-server-bcc4b6f68-rjxbv\" (UID: \"545a6f92-59ae-4ffb-824d-e493044c0082\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rjxbv" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.892625 4685 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-drm2w\" (UniqueName: \"kubernetes.io/projected/7e226197-b1ce-49a2-a3b9-5aed3d774a12-kube-api-access-drm2w\") pod \"frr-k8s-tsjrs\" (UID: \"7e226197-b1ce-49a2-a3b9-5aed3d774a12\") " pod="metallb-system/frr-k8s-tsjrs" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.904371 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-tsjrs" Mar 21 04:00:35 crc kubenswrapper[4685]: I0321 04:00:35.942570 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rjxbv" Mar 21 04:00:36 crc kubenswrapper[4685]: I0321 04:00:36.005143 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-w4n8h" Mar 21 04:00:36 crc kubenswrapper[4685]: I0321 04:00:36.198728 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-rjxbv"] Mar 21 04:00:36 crc kubenswrapper[4685]: I0321 04:00:36.237271 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-w4n8h"] Mar 21 04:00:36 crc kubenswrapper[4685]: W0321 04:00:36.247096 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76cca999_b151_46c4_b61b_b6249d75e2f5.slice/crio-075465a6a66522b786e2c6479bbd0b33dc6048270ad03d8c3c955ead3eb0cdda WatchSource:0}: Error finding container 075465a6a66522b786e2c6479bbd0b33dc6048270ad03d8c3c955ead3eb0cdda: Status 404 returned error can't find the container with id 075465a6a66522b786e2c6479bbd0b33dc6048270ad03d8c3c955ead3eb0cdda Mar 21 04:00:36 crc kubenswrapper[4685]: I0321 04:00:36.370162 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/dc9a0be8-5b8c-43d8-a670-06541535d7a0-memberlist\") pod \"speaker-cd4d4\" (UID: \"dc9a0be8-5b8c-43d8-a670-06541535d7a0\") " pod="metallb-system/speaker-cd4d4" Mar 21 04:00:36 crc kubenswrapper[4685]: E0321 04:00:36.370406 4685 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 21 04:00:36 crc kubenswrapper[4685]: E0321 04:00:36.370798 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc9a0be8-5b8c-43d8-a670-06541535d7a0-memberlist podName:dc9a0be8-5b8c-43d8-a670-06541535d7a0 nodeName:}" failed. No retries permitted until 2026-03-21 04:00:37.370777618 +0000 UTC m=+869.847846500 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/dc9a0be8-5b8c-43d8-a670-06541535d7a0-memberlist") pod "speaker-cd4d4" (UID: "dc9a0be8-5b8c-43d8-a670-06541535d7a0") : secret "metallb-memberlist" not found Mar 21 04:00:37 crc kubenswrapper[4685]: I0321 04:00:37.086879 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-w4n8h" event={"ID":"76cca999-b151-46c4-b61b-b6249d75e2f5","Type":"ContainerStarted","Data":"91960a68dc7f203cf93b971ad85b1158aa4be58dcfbf2d35640f2d3754d47e6a"} Mar 21 04:00:37 crc kubenswrapper[4685]: I0321 04:00:37.086925 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-w4n8h" event={"ID":"76cca999-b151-46c4-b61b-b6249d75e2f5","Type":"ContainerStarted","Data":"075465a6a66522b786e2c6479bbd0b33dc6048270ad03d8c3c955ead3eb0cdda"} Mar 21 04:00:37 crc kubenswrapper[4685]: I0321 04:00:37.089202 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tsjrs" event={"ID":"7e226197-b1ce-49a2-a3b9-5aed3d774a12","Type":"ContainerStarted","Data":"7d6fc14db7df773baf9471cc04246f81798450dcb915832cb4f5b62a325a80c6"} Mar 21 04:00:37 crc kubenswrapper[4685]: I0321 04:00:37.090298 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rjxbv" event={"ID":"545a6f92-59ae-4ffb-824d-e493044c0082","Type":"ContainerStarted","Data":"4eea75275056433fd5e4e270df597161960c13bfffa8ad5df7c91ec506f17feb"} Mar 21 04:00:37 crc kubenswrapper[4685]: I0321 04:00:37.382178 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/dc9a0be8-5b8c-43d8-a670-06541535d7a0-memberlist\") pod \"speaker-cd4d4\" (UID: \"dc9a0be8-5b8c-43d8-a670-06541535d7a0\") " pod="metallb-system/speaker-cd4d4" Mar 21 04:00:37 crc kubenswrapper[4685]: I0321 04:00:37.390994 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/dc9a0be8-5b8c-43d8-a670-06541535d7a0-memberlist\") pod \"speaker-cd4d4\" (UID: \"dc9a0be8-5b8c-43d8-a670-06541535d7a0\") " pod="metallb-system/speaker-cd4d4" Mar 21 04:00:37 crc kubenswrapper[4685]: I0321 04:00:37.493316 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-cd4d4" Mar 21 04:00:37 crc kubenswrapper[4685]: W0321 04:00:37.510678 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc9a0be8_5b8c_43d8_a670_06541535d7a0.slice/crio-d5290808d317e2e04cb1f204afcd9548ecea120c9823c5f8e758cbf01e3c952f WatchSource:0}: Error finding container d5290808d317e2e04cb1f204afcd9548ecea120c9823c5f8e758cbf01e3c952f: Status 404 returned error can't find the container with id d5290808d317e2e04cb1f204afcd9548ecea120c9823c5f8e758cbf01e3c952f Mar 21 04:00:38 crc kubenswrapper[4685]: I0321 04:00:38.105238 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cd4d4" event={"ID":"dc9a0be8-5b8c-43d8-a670-06541535d7a0","Type":"ContainerStarted","Data":"5f0bde342318f6fb41ba1804a73085ccea7cea84b887146fd381eec622089456"} Mar 21 04:00:38 crc kubenswrapper[4685]: I0321 04:00:38.105282 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cd4d4" event={"ID":"dc9a0be8-5b8c-43d8-a670-06541535d7a0","Type":"ContainerStarted","Data":"d5290808d317e2e04cb1f204afcd9548ecea120c9823c5f8e758cbf01e3c952f"} Mar 21 04:00:41 crc kubenswrapper[4685]: I0321 04:00:41.146713 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cd4d4" event={"ID":"dc9a0be8-5b8c-43d8-a670-06541535d7a0","Type":"ContainerStarted","Data":"9b1830128fa13d39efb4abea6067484806ca4b5d966093de11f25d097676ae6a"} Mar 21 04:00:41 crc kubenswrapper[4685]: I0321 04:00:41.147232 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-cd4d4" Mar 21 04:00:41 crc kubenswrapper[4685]: I0321 04:00:41.150581 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-w4n8h" event={"ID":"76cca999-b151-46c4-b61b-b6249d75e2f5","Type":"ContainerStarted","Data":"ea2a2b8153f728b131bdd8423544f9e03a70f43fb1a5fa795df3b88348e7764d"} Mar 21 04:00:41 crc kubenswrapper[4685]: I0321 04:00:41.150713 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-w4n8h" Mar 21 04:00:41 crc kubenswrapper[4685]: I0321 04:00:41.183016 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-cd4d4" podStartSLOduration=3.251107372 podStartE2EDuration="6.182997672s" podCreationTimestamp="2026-03-21 04:00:35 +0000 UTC" firstStartedPulling="2026-03-21 04:00:37.698093343 +0000 UTC m=+870.175162135" lastFinishedPulling="2026-03-21 04:00:40.629983643 +0000 UTC m=+873.107052435" observedRunningTime="2026-03-21 04:00:41.165285418 +0000 UTC m=+873.642354210" watchObservedRunningTime="2026-03-21 04:00:41.182997672 +0000 UTC m=+873.660066464" Mar 21 04:00:41 crc kubenswrapper[4685]: I0321 04:00:41.183791 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-w4n8h" podStartSLOduration=1.920996395 podStartE2EDuration="6.183782894s" podCreationTimestamp="2026-03-21 04:00:35 +0000 UTC" firstStartedPulling="2026-03-21 04:00:36.359393688 +0000 UTC m=+868.836462500" lastFinishedPulling="2026-03-21 04:00:40.622180207 +0000 UTC m=+873.099248999" observedRunningTime="2026-03-21 04:00:41.179465359 +0000 UTC m=+873.656534161" watchObservedRunningTime="2026-03-21 04:00:41.183782894 +0000 UTC m=+873.660851686" Mar 21 04:00:44 crc kubenswrapper[4685]: I0321 04:00:44.179124 4685 generic.go:334] "Generic (PLEG): container finished" 
podID="7e226197-b1ce-49a2-a3b9-5aed3d774a12" containerID="fa540aedc2f7cf7922f3e78839f6538f449cc0256aa0f68147eea1507c1e6b53" exitCode=0 Mar 21 04:00:44 crc kubenswrapper[4685]: I0321 04:00:44.179188 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tsjrs" event={"ID":"7e226197-b1ce-49a2-a3b9-5aed3d774a12","Type":"ContainerDied","Data":"fa540aedc2f7cf7922f3e78839f6538f449cc0256aa0f68147eea1507c1e6b53"} Mar 21 04:00:44 crc kubenswrapper[4685]: I0321 04:00:44.182561 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rjxbv" event={"ID":"545a6f92-59ae-4ffb-824d-e493044c0082","Type":"ContainerStarted","Data":"aa227d08cfecc022a491a35b29761ffaefd528adbfbdeec43d87621cef2bdbb1"} Mar 21 04:00:44 crc kubenswrapper[4685]: I0321 04:00:44.182687 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rjxbv" Mar 21 04:00:44 crc kubenswrapper[4685]: I0321 04:00:44.233340 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rjxbv" podStartSLOduration=1.679978426 podStartE2EDuration="9.233314297s" podCreationTimestamp="2026-03-21 04:00:35 +0000 UTC" firstStartedPulling="2026-03-21 04:00:36.20842254 +0000 UTC m=+868.685491332" lastFinishedPulling="2026-03-21 04:00:43.761758411 +0000 UTC m=+876.238827203" observedRunningTime="2026-03-21 04:00:44.233021768 +0000 UTC m=+876.710090600" watchObservedRunningTime="2026-03-21 04:00:44.233314297 +0000 UTC m=+876.710383119" Mar 21 04:00:45 crc kubenswrapper[4685]: I0321 04:00:45.191626 4685 generic.go:334] "Generic (PLEG): container finished" podID="7e226197-b1ce-49a2-a3b9-5aed3d774a12" containerID="46942f2b491eedd9cfab73c90c9a04f2c06d7b6c5119d756ed38401a20abdd28" exitCode=0 Mar 21 04:00:45 crc kubenswrapper[4685]: I0321 04:00:45.191694 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tsjrs" event={"ID":"7e226197-b1ce-49a2-a3b9-5aed3d774a12","Type":"ContainerDied","Data":"46942f2b491eedd9cfab73c90c9a04f2c06d7b6c5119d756ed38401a20abdd28"} Mar 21 04:00:46 crc kubenswrapper[4685]: I0321 04:00:46.011713 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-w4n8h" Mar 21 04:00:46 crc kubenswrapper[4685]: I0321 04:00:46.198749 4685 generic.go:334] "Generic (PLEG): container finished" podID="7e226197-b1ce-49a2-a3b9-5aed3d774a12" containerID="27c677b2b8870fde1282d5b70a32f9cfe9cfc90e0ee03a829eb59179c93c1931" exitCode=0 Mar 21 04:00:46 crc kubenswrapper[4685]: I0321 04:00:46.198793 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tsjrs" event={"ID":"7e226197-b1ce-49a2-a3b9-5aed3d774a12","Type":"ContainerDied","Data":"27c677b2b8870fde1282d5b70a32f9cfe9cfc90e0ee03a829eb59179c93c1931"} Mar 21 04:00:47 crc kubenswrapper[4685]: I0321 04:00:47.211268 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tsjrs" event={"ID":"7e226197-b1ce-49a2-a3b9-5aed3d774a12","Type":"ContainerStarted","Data":"c337437f766e5eaa9930dbd9fb1e1f7156ea6c2bfa09ea92dea3150cec48e18b"} Mar 21 04:00:47 crc kubenswrapper[4685]: I0321 04:00:47.212007 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tsjrs" event={"ID":"7e226197-b1ce-49a2-a3b9-5aed3d774a12","Type":"ContainerStarted","Data":"61d34ac7d7cc4dce10f3a14ec5ccad53ccc16b38c4d5b41d4b6ba0959566bbe8"} Mar 21 04:00:47 crc kubenswrapper[4685]: I0321 
04:00:47.212027 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tsjrs" event={"ID":"7e226197-b1ce-49a2-a3b9-5aed3d774a12","Type":"ContainerStarted","Data":"b151afacf6fd116f64c90f76f91edc184e52ee3ced72107ae716adf87d9459fe"} Mar 21 04:00:47 crc kubenswrapper[4685]: I0321 04:00:47.212066 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tsjrs" event={"ID":"7e226197-b1ce-49a2-a3b9-5aed3d774a12","Type":"ContainerStarted","Data":"d694d630a83f9692cf1b5e82f62e8815e82f2622f0b89e55020d612579bdaf25"} Mar 21 04:00:47 crc kubenswrapper[4685]: I0321 04:00:47.212077 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tsjrs" event={"ID":"7e226197-b1ce-49a2-a3b9-5aed3d774a12","Type":"ContainerStarted","Data":"201d0fd458b9c8821ccb7a170d78cbe26eaeb17bc245033116704834dc98482f"} Mar 21 04:00:47 crc kubenswrapper[4685]: I0321 04:00:47.500043 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-cd4d4" Mar 21 04:00:48 crc kubenswrapper[4685]: I0321 04:00:48.222123 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tsjrs" event={"ID":"7e226197-b1ce-49a2-a3b9-5aed3d774a12","Type":"ContainerStarted","Data":"caab116f28af943dfc288b331be6f134c675528ee603a9d902ce86e980390b4c"} Mar 21 04:00:48 crc kubenswrapper[4685]: I0321 04:00:48.222313 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-tsjrs" Mar 21 04:00:48 crc kubenswrapper[4685]: I0321 04:00:48.251845 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-tsjrs" podStartSLOduration=5.639040708 podStartE2EDuration="13.251813012s" podCreationTimestamp="2026-03-21 04:00:35 +0000 UTC" firstStartedPulling="2026-03-21 04:00:36.081987283 +0000 UTC m=+868.559056075" lastFinishedPulling="2026-03-21 04:00:43.694759587 +0000 UTC m=+876.171828379" observedRunningTime="2026-03-21 04:00:48.249119764 +0000 UTC m=+880.726188556" watchObservedRunningTime="2026-03-21 04:00:48.251813012 +0000 UTC m=+880.728881804" Mar 21 04:00:50 crc kubenswrapper[4685]: I0321 04:00:50.904805 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-tsjrs" Mar 21 04:00:50 crc kubenswrapper[4685]: I0321 04:00:50.942130 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-tsjrs" Mar 21 04:00:53 crc kubenswrapper[4685]: I0321 04:00:53.143286 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-pl5xp"] Mar 21 04:00:53 crc kubenswrapper[4685]: I0321 04:00:53.144404 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-pl5xp" Mar 21 04:00:53 crc kubenswrapper[4685]: I0321 04:00:53.147878 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 21 04:00:53 crc kubenswrapper[4685]: I0321 04:00:53.150882 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-index-dockercfg-6dz2f" Mar 21 04:00:53 crc kubenswrapper[4685]: I0321 04:00:53.151062 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 21 04:00:53 crc kubenswrapper[4685]: I0321 04:00:53.155004 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-pl5xp"] Mar 21 04:00:53 crc kubenswrapper[4685]: I0321 04:00:53.311975 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vbpb\" (UniqueName: \"kubernetes.io/projected/11ac48ad-4459-49c1-b05e-cb82025f031b-kube-api-access-7vbpb\") pod \"mariadb-operator-index-pl5xp\" (UID: \"11ac48ad-4459-49c1-b05e-cb82025f031b\") " pod="openstack-operators/mariadb-operator-index-pl5xp" Mar 21 04:00:53 crc kubenswrapper[4685]: I0321 04:00:53.412883 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vbpb\" (UniqueName: \"kubernetes.io/projected/11ac48ad-4459-49c1-b05e-cb82025f031b-kube-api-access-7vbpb\") pod \"mariadb-operator-index-pl5xp\" (UID: \"11ac48ad-4459-49c1-b05e-cb82025f031b\") " pod="openstack-operators/mariadb-operator-index-pl5xp" Mar 21 04:00:53 crc kubenswrapper[4685]: I0321 04:00:53.436340 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vbpb\" (UniqueName: \"kubernetes.io/projected/11ac48ad-4459-49c1-b05e-cb82025f031b-kube-api-access-7vbpb\") pod \"mariadb-operator-index-pl5xp\" (UID: \"11ac48ad-4459-49c1-b05e-cb82025f031b\") " pod="openstack-operators/mariadb-operator-index-pl5xp" Mar 21 04:00:53 crc kubenswrapper[4685]: I0321 04:00:53.466340 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-pl5xp" Mar 21 04:00:53 crc kubenswrapper[4685]: I0321 04:00:53.696248 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-pl5xp"] Mar 21 04:00:53 crc kubenswrapper[4685]: W0321 04:00:53.702229 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11ac48ad_4459_49c1_b05e_cb82025f031b.slice/crio-f5f4630df2bc21e7885d907c111147221c29ac6076662e1e31f6f47f4a4fbd07 WatchSource:0}: Error finding container f5f4630df2bc21e7885d907c111147221c29ac6076662e1e31f6f47f4a4fbd07: Status 404 returned error can't find the container with id f5f4630df2bc21e7885d907c111147221c29ac6076662e1e31f6f47f4a4fbd07 Mar 21 04:00:54 crc kubenswrapper[4685]: I0321 04:00:54.262570 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-pl5xp" event={"ID":"11ac48ad-4459-49c1-b05e-cb82025f031b","Type":"ContainerStarted","Data":"f5f4630df2bc21e7885d907c111147221c29ac6076662e1e31f6f47f4a4fbd07"} Mar 21 04:00:55 crc kubenswrapper[4685]: I0321 04:00:55.273413 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-pl5xp" event={"ID":"11ac48ad-4459-49c1-b05e-cb82025f031b","Type":"ContainerStarted","Data":"8220354cb3b334ec687bb42d99adda5245fb2c53e8fd73b4ea6c8d16725b3476"} Mar 21 04:00:55 crc kubenswrapper[4685]: I0321 04:00:55.289888 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-pl5xp" podStartSLOduration=1.273091053 podStartE2EDuration="2.28987091s" podCreationTimestamp="2026-03-21 04:00:53 +0000 UTC" firstStartedPulling="2026-03-21 04:00:53.704221464 +0000 UTC m=+886.181290256" lastFinishedPulling="2026-03-21 04:00:54.721001321 +0000 UTC m=+887.198070113" observedRunningTime="2026-03-21 04:00:55.286354928 +0000 UTC m=+887.763423750" watchObservedRunningTime="2026-03-21 04:00:55.28987091 +0000 UTC m=+887.766939702" Mar 21 04:00:55 crc kubenswrapper[4685]: I0321 04:00:55.948162 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rjxbv" Mar 21 04:00:56 crc kubenswrapper[4685]: I0321 04:00:56.529565 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-pl5xp"] Mar 21 04:00:57 crc kubenswrapper[4685]: I0321 04:00:57.132325 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-v8f9z"] Mar 21 04:00:57 crc kubenswrapper[4685]: I0321 04:00:57.133157 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-v8f9z" Mar 21 04:00:57 crc kubenswrapper[4685]: I0321 04:00:57.142228 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-v8f9z"] Mar 21 04:00:57 crc kubenswrapper[4685]: I0321 04:00:57.261387 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjm5c\" (UniqueName: \"kubernetes.io/projected/0a32f016-2085-4f24-83d7-68a6a01d1f02-kube-api-access-qjm5c\") pod \"mariadb-operator-index-v8f9z\" (UID: \"0a32f016-2085-4f24-83d7-68a6a01d1f02\") " pod="openstack-operators/mariadb-operator-index-v8f9z" Mar 21 04:00:57 crc kubenswrapper[4685]: I0321 04:00:57.284302 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-pl5xp" podUID="11ac48ad-4459-49c1-b05e-cb82025f031b" containerName="registry-server" containerID="cri-o://8220354cb3b334ec687bb42d99adda5245fb2c53e8fd73b4ea6c8d16725b3476" gracePeriod=2 Mar 21 04:00:57 crc kubenswrapper[4685]: I0321 04:00:57.362314 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjm5c\" (UniqueName: \"kubernetes.io/projected/0a32f016-2085-4f24-83d7-68a6a01d1f02-kube-api-access-qjm5c\") pod \"mariadb-operator-index-v8f9z\" (UID: \"0a32f016-2085-4f24-83d7-68a6a01d1f02\") " pod="openstack-operators/mariadb-operator-index-v8f9z" Mar 21 04:00:57 crc kubenswrapper[4685]: I0321 04:00:57.379981 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjm5c\" (UniqueName: \"kubernetes.io/projected/0a32f016-2085-4f24-83d7-68a6a01d1f02-kube-api-access-qjm5c\") pod \"mariadb-operator-index-v8f9z\" (UID: \"0a32f016-2085-4f24-83d7-68a6a01d1f02\") " pod="openstack-operators/mariadb-operator-index-v8f9z" Mar 21 04:00:57 crc kubenswrapper[4685]: I0321 04:00:57.450996 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-v8f9z" Mar 21 04:00:57 crc kubenswrapper[4685]: I0321 04:00:57.665711 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-pl5xp" Mar 21 04:00:57 crc kubenswrapper[4685]: I0321 04:00:57.668219 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vbpb\" (UniqueName: \"kubernetes.io/projected/11ac48ad-4459-49c1-b05e-cb82025f031b-kube-api-access-7vbpb\") pod \"11ac48ad-4459-49c1-b05e-cb82025f031b\" (UID: \"11ac48ad-4459-49c1-b05e-cb82025f031b\") " Mar 21 04:00:57 crc kubenswrapper[4685]: I0321 04:00:57.673676 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11ac48ad-4459-49c1-b05e-cb82025f031b-kube-api-access-7vbpb" (OuterVolumeSpecName: "kube-api-access-7vbpb") pod "11ac48ad-4459-49c1-b05e-cb82025f031b" (UID: "11ac48ad-4459-49c1-b05e-cb82025f031b"). InnerVolumeSpecName "kube-api-access-7vbpb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:00:57 crc kubenswrapper[4685]: I0321 04:00:57.716830 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-v8f9z"] Mar 21 04:00:57 crc kubenswrapper[4685]: I0321 04:00:57.769236 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vbpb\" (UniqueName: \"kubernetes.io/projected/11ac48ad-4459-49c1-b05e-cb82025f031b-kube-api-access-7vbpb\") on node \"crc\" DevicePath \"\"" Mar 21 04:00:58 crc kubenswrapper[4685]: I0321 04:00:58.289882 4685 generic.go:334] "Generic (PLEG): container finished" podID="11ac48ad-4459-49c1-b05e-cb82025f031b" containerID="8220354cb3b334ec687bb42d99adda5245fb2c53e8fd73b4ea6c8d16725b3476" exitCode=0 Mar 21 04:00:58 crc kubenswrapper[4685]: I0321 04:00:58.289957 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-pl5xp" event={"ID":"11ac48ad-4459-49c1-b05e-cb82025f031b","Type":"ContainerDied","Data":"8220354cb3b334ec687bb42d99adda5245fb2c53e8fd73b4ea6c8d16725b3476"} Mar 21 04:00:58 crc kubenswrapper[4685]: I0321 04:00:58.289966 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-pl5xp" Mar 21 04:00:58 crc kubenswrapper[4685]: I0321 04:00:58.289985 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-pl5xp" event={"ID":"11ac48ad-4459-49c1-b05e-cb82025f031b","Type":"ContainerDied","Data":"f5f4630df2bc21e7885d907c111147221c29ac6076662e1e31f6f47f4a4fbd07"} Mar 21 04:00:58 crc kubenswrapper[4685]: I0321 04:00:58.290002 4685 scope.go:117] "RemoveContainer" containerID="8220354cb3b334ec687bb42d99adda5245fb2c53e8fd73b4ea6c8d16725b3476" Mar 21 04:00:58 crc kubenswrapper[4685]: I0321 04:00:58.291025 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-v8f9z" event={"ID":"0a32f016-2085-4f24-83d7-68a6a01d1f02","Type":"ContainerStarted","Data":"12cda35357d8475a4c1eacac304a470236d20996b00d97bc5086a2e4ac8ec7f7"} Mar 21 04:00:58 crc kubenswrapper[4685]: I0321 04:00:58.317520 4685 scope.go:117] "RemoveContainer" containerID="8220354cb3b334ec687bb42d99adda5245fb2c53e8fd73b4ea6c8d16725b3476" Mar 21 04:00:58 crc kubenswrapper[4685]: E0321 04:00:58.317954 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8220354cb3b334ec687bb42d99adda5245fb2c53e8fd73b4ea6c8d16725b3476\": container with ID starting with 8220354cb3b334ec687bb42d99adda5245fb2c53e8fd73b4ea6c8d16725b3476 not found: ID does not exist" containerID="8220354cb3b334ec687bb42d99adda5245fb2c53e8fd73b4ea6c8d16725b3476" Mar 21 04:00:58 crc kubenswrapper[4685]: I0321 04:00:58.318265 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8220354cb3b334ec687bb42d99adda5245fb2c53e8fd73b4ea6c8d16725b3476"} err="failed to get container status \"8220354cb3b334ec687bb42d99adda5245fb2c53e8fd73b4ea6c8d16725b3476\": rpc error: code = NotFound desc = could not find container \"8220354cb3b334ec687bb42d99adda5245fb2c53e8fd73b4ea6c8d16725b3476\": container with ID starting with 8220354cb3b334ec687bb42d99adda5245fb2c53e8fd73b4ea6c8d16725b3476 not found: ID does not exist" Mar 21 04:00:58 crc kubenswrapper[4685]: I0321 04:00:58.346202 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-pl5xp"] Mar 21 04:00:58 crc 
kubenswrapper[4685]: I0321 04:00:58.351202 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-pl5xp"] Mar 21 04:00:59 crc kubenswrapper[4685]: I0321 04:00:59.303328 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-v8f9z" event={"ID":"0a32f016-2085-4f24-83d7-68a6a01d1f02","Type":"ContainerStarted","Data":"9c4fa5b79b74d881d929682dcc86b2b780b6847ad69b900b5a0deabf5967e119"} Mar 21 04:00:59 crc kubenswrapper[4685]: I0321 04:00:59.322896 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-v8f9z" podStartSLOduration=1.661630098 podStartE2EDuration="2.322879805s" podCreationTimestamp="2026-03-21 04:00:57 +0000 UTC" firstStartedPulling="2026-03-21 04:00:57.729481604 +0000 UTC m=+890.206550396" lastFinishedPulling="2026-03-21 04:00:58.390731311 +0000 UTC m=+890.867800103" observedRunningTime="2026-03-21 04:00:59.322171145 +0000 UTC m=+891.799239937" watchObservedRunningTime="2026-03-21 04:00:59.322879805 +0000 UTC m=+891.799948597" Mar 21 04:01:00 crc kubenswrapper[4685]: I0321 04:01:00.312124 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11ac48ad-4459-49c1-b05e-cb82025f031b" path="/var/lib/kubelet/pods/11ac48ad-4459-49c1-b05e-cb82025f031b/volumes" Mar 21 04:01:05 crc kubenswrapper[4685]: I0321 04:01:05.907729 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-tsjrs" Mar 21 04:01:07 crc kubenswrapper[4685]: I0321 04:01:07.451598 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-index-v8f9z" Mar 21 04:01:07 crc kubenswrapper[4685]: I0321 04:01:07.451809 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/mariadb-operator-index-v8f9z" Mar 21 04:01:07 crc kubenswrapper[4685]: I0321 04:01:07.502623 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/mariadb-operator-index-v8f9z" Mar 21 04:01:08 crc kubenswrapper[4685]: I0321 04:01:08.413604 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-index-v8f9z" Mar 21 04:01:15 crc kubenswrapper[4685]: I0321 04:01:15.983758 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/7e3b51b6a8afd7c782c20eee88f3ff29a8534df38096cf089dfa437de6m8ksc"] Mar 21 04:01:15 crc kubenswrapper[4685]: E0321 04:01:15.984594 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11ac48ad-4459-49c1-b05e-cb82025f031b" containerName="registry-server" Mar 21 04:01:15 crc kubenswrapper[4685]: I0321 04:01:15.984612 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="11ac48ad-4459-49c1-b05e-cb82025f031b" containerName="registry-server" Mar 21 04:01:15 crc kubenswrapper[4685]: I0321 04:01:15.984789 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="11ac48ad-4459-49c1-b05e-cb82025f031b" containerName="registry-server" Mar 21 04:01:15 crc kubenswrapper[4685]: I0321 04:01:15.986028 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7e3b51b6a8afd7c782c20eee88f3ff29a8534df38096cf089dfa437de6m8ksc" Mar 21 04:01:15 crc kubenswrapper[4685]: I0321 04:01:15.989609 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-2vwsz" Mar 21 04:01:15 crc kubenswrapper[4685]: I0321 04:01:15.992955 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7e3b51b6a8afd7c782c20eee88f3ff29a8534df38096cf089dfa437de6m8ksc"] Mar 21 04:01:16 crc kubenswrapper[4685]: I0321 04:01:16.106966 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1576ba2d-61cd-4dde-b4ce-eab38b64a3f2-bundle\") pod \"7e3b51b6a8afd7c782c20eee88f3ff29a8534df38096cf089dfa437de6m8ksc\" (UID: \"1576ba2d-61cd-4dde-b4ce-eab38b64a3f2\") " pod="openstack-operators/7e3b51b6a8afd7c782c20eee88f3ff29a8534df38096cf089dfa437de6m8ksc" Mar 21 04:01:16 crc kubenswrapper[4685]: I0321 04:01:16.107076 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h2tj\" (UniqueName: \"kubernetes.io/projected/1576ba2d-61cd-4dde-b4ce-eab38b64a3f2-kube-api-access-8h2tj\") pod \"7e3b51b6a8afd7c782c20eee88f3ff29a8534df38096cf089dfa437de6m8ksc\" (UID: \"1576ba2d-61cd-4dde-b4ce-eab38b64a3f2\") " pod="openstack-operators/7e3b51b6a8afd7c782c20eee88f3ff29a8534df38096cf089dfa437de6m8ksc" Mar 21 04:01:16 crc kubenswrapper[4685]: I0321 04:01:16.107119 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1576ba2d-61cd-4dde-b4ce-eab38b64a3f2-util\") pod \"7e3b51b6a8afd7c782c20eee88f3ff29a8534df38096cf089dfa437de6m8ksc\" (UID: \"1576ba2d-61cd-4dde-b4ce-eab38b64a3f2\") " pod="openstack-operators/7e3b51b6a8afd7c782c20eee88f3ff29a8534df38096cf089dfa437de6m8ksc" Mar 21 04:01:16 crc kubenswrapper[4685]: I0321 04:01:16.207685 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1576ba2d-61cd-4dde-b4ce-eab38b64a3f2-bundle\") pod \"7e3b51b6a8afd7c782c20eee88f3ff29a8534df38096cf089dfa437de6m8ksc\" (UID: \"1576ba2d-61cd-4dde-b4ce-eab38b64a3f2\") " pod="openstack-operators/7e3b51b6a8afd7c782c20eee88f3ff29a8534df38096cf089dfa437de6m8ksc" Mar 21 04:01:16 crc kubenswrapper[4685]: I0321 04:01:16.207750 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h2tj\" (UniqueName: \"kubernetes.io/projected/1576ba2d-61cd-4dde-b4ce-eab38b64a3f2-kube-api-access-8h2tj\") pod \"7e3b51b6a8afd7c782c20eee88f3ff29a8534df38096cf089dfa437de6m8ksc\" (UID: \"1576ba2d-61cd-4dde-b4ce-eab38b64a3f2\") " pod="openstack-operators/7e3b51b6a8afd7c782c20eee88f3ff29a8534df38096cf089dfa437de6m8ksc" Mar 21 04:01:16 crc kubenswrapper[4685]: I0321 04:01:16.207778 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1576ba2d-61cd-4dde-b4ce-eab38b64a3f2-util\") pod \"7e3b51b6a8afd7c782c20eee88f3ff29a8534df38096cf089dfa437de6m8ksc\" (UID: \"1576ba2d-61cd-4dde-b4ce-eab38b64a3f2\") " pod="openstack-operators/7e3b51b6a8afd7c782c20eee88f3ff29a8534df38096cf089dfa437de6m8ksc" Mar 21 04:01:16 crc kubenswrapper[4685]: I0321 04:01:16.208383 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/1576ba2d-61cd-4dde-b4ce-eab38b64a3f2-util\") pod \"7e3b51b6a8afd7c782c20eee88f3ff29a8534df38096cf089dfa437de6m8ksc\" (UID: \"1576ba2d-61cd-4dde-b4ce-eab38b64a3f2\") " pod="openstack-operators/7e3b51b6a8afd7c782c20eee88f3ff29a8534df38096cf089dfa437de6m8ksc" Mar 21 04:01:16 crc kubenswrapper[4685]: I0321 04:01:16.208389 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1576ba2d-61cd-4dde-b4ce-eab38b64a3f2-bundle\") pod \"7e3b51b6a8afd7c782c20eee88f3ff29a8534df38096cf089dfa437de6m8ksc\" (UID: \"1576ba2d-61cd-4dde-b4ce-eab38b64a3f2\") " pod="openstack-operators/7e3b51b6a8afd7c782c20eee88f3ff29a8534df38096cf089dfa437de6m8ksc" Mar 21 04:01:16 crc kubenswrapper[4685]: I0321 04:01:16.241557 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h2tj\" (UniqueName: \"kubernetes.io/projected/1576ba2d-61cd-4dde-b4ce-eab38b64a3f2-kube-api-access-8h2tj\") pod \"7e3b51b6a8afd7c782c20eee88f3ff29a8534df38096cf089dfa437de6m8ksc\" (UID: \"1576ba2d-61cd-4dde-b4ce-eab38b64a3f2\") " pod="openstack-operators/7e3b51b6a8afd7c782c20eee88f3ff29a8534df38096cf089dfa437de6m8ksc" Mar 21 04:01:16 crc kubenswrapper[4685]: I0321 04:01:16.307538 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7e3b51b6a8afd7c782c20eee88f3ff29a8534df38096cf089dfa437de6m8ksc" Mar 21 04:01:16 crc kubenswrapper[4685]: I0321 04:01:16.752163 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7e3b51b6a8afd7c782c20eee88f3ff29a8534df38096cf089dfa437de6m8ksc"] Mar 21 04:01:17 crc kubenswrapper[4685]: I0321 04:01:17.455697 4685 generic.go:334] "Generic (PLEG): container finished" podID="1576ba2d-61cd-4dde-b4ce-eab38b64a3f2" containerID="a70e6b6534a9f9dd0ca29ac76fb7b519a688f958c1e1792d75fc98b6116e8d58" exitCode=0 Mar 21 04:01:17 crc kubenswrapper[4685]: I0321 04:01:17.455762 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7e3b51b6a8afd7c782c20eee88f3ff29a8534df38096cf089dfa437de6m8ksc" event={"ID":"1576ba2d-61cd-4dde-b4ce-eab38b64a3f2","Type":"ContainerDied","Data":"a70e6b6534a9f9dd0ca29ac76fb7b519a688f958c1e1792d75fc98b6116e8d58"} Mar 21 04:01:17 crc kubenswrapper[4685]: I0321 04:01:17.455857 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7e3b51b6a8afd7c782c20eee88f3ff29a8534df38096cf089dfa437de6m8ksc" event={"ID":"1576ba2d-61cd-4dde-b4ce-eab38b64a3f2","Type":"ContainerStarted","Data":"dfdcdf8e912519cbcead21df7f1269d84d40bd17943d14b83180bc4fd66578de"} Mar 21 04:01:17 crc kubenswrapper[4685]: I0321 04:01:17.459820 4685 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 04:01:18 crc kubenswrapper[4685]: I0321 04:01:18.462231 4685 generic.go:334] "Generic (PLEG): container finished" podID="1576ba2d-61cd-4dde-b4ce-eab38b64a3f2" containerID="dba5e2417eade1eaaa37fce7e18ce3624852e3e52aa73be129bce881364f51ac" exitCode=0 Mar 21 04:01:18 crc kubenswrapper[4685]: I0321 04:01:18.462293 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7e3b51b6a8afd7c782c20eee88f3ff29a8534df38096cf089dfa437de6m8ksc" event={"ID":"1576ba2d-61cd-4dde-b4ce-eab38b64a3f2","Type":"ContainerDied","Data":"dba5e2417eade1eaaa37fce7e18ce3624852e3e52aa73be129bce881364f51ac"} Mar 21 04:01:19 crc kubenswrapper[4685]: I0321 04:01:19.480056 4685 generic.go:334] "Generic (PLEG): container finished" 
podID="1576ba2d-61cd-4dde-b4ce-eab38b64a3f2" containerID="59c344cffa029dbb85f2621534d4872f749153ca1d3368b50a888fa846e9a355" exitCode=0 Mar 21 04:01:19 crc kubenswrapper[4685]: I0321 04:01:19.480125 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7e3b51b6a8afd7c782c20eee88f3ff29a8534df38096cf089dfa437de6m8ksc" event={"ID":"1576ba2d-61cd-4dde-b4ce-eab38b64a3f2","Type":"ContainerDied","Data":"59c344cffa029dbb85f2621534d4872f749153ca1d3368b50a888fa846e9a355"} Mar 21 04:01:20 crc kubenswrapper[4685]: I0321 04:01:20.763735 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7e3b51b6a8afd7c782c20eee88f3ff29a8534df38096cf089dfa437de6m8ksc" Mar 21 04:01:20 crc kubenswrapper[4685]: I0321 04:01:20.868436 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1576ba2d-61cd-4dde-b4ce-eab38b64a3f2-bundle\") pod \"1576ba2d-61cd-4dde-b4ce-eab38b64a3f2\" (UID: \"1576ba2d-61cd-4dde-b4ce-eab38b64a3f2\") " Mar 21 04:01:20 crc kubenswrapper[4685]: I0321 04:01:20.868567 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1576ba2d-61cd-4dde-b4ce-eab38b64a3f2-util\") pod \"1576ba2d-61cd-4dde-b4ce-eab38b64a3f2\" (UID: \"1576ba2d-61cd-4dde-b4ce-eab38b64a3f2\") " Mar 21 04:01:20 crc kubenswrapper[4685]: I0321 04:01:20.868807 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8h2tj\" (UniqueName: \"kubernetes.io/projected/1576ba2d-61cd-4dde-b4ce-eab38b64a3f2-kube-api-access-8h2tj\") pod \"1576ba2d-61cd-4dde-b4ce-eab38b64a3f2\" (UID: \"1576ba2d-61cd-4dde-b4ce-eab38b64a3f2\") " Mar 21 04:01:20 crc kubenswrapper[4685]: I0321 04:01:20.870405 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1576ba2d-61cd-4dde-b4ce-eab38b64a3f2-bundle" (OuterVolumeSpecName: "bundle") pod "1576ba2d-61cd-4dde-b4ce-eab38b64a3f2" (UID: "1576ba2d-61cd-4dde-b4ce-eab38b64a3f2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:01:20 crc kubenswrapper[4685]: I0321 04:01:20.877563 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1576ba2d-61cd-4dde-b4ce-eab38b64a3f2-kube-api-access-8h2tj" (OuterVolumeSpecName: "kube-api-access-8h2tj") pod "1576ba2d-61cd-4dde-b4ce-eab38b64a3f2" (UID: "1576ba2d-61cd-4dde-b4ce-eab38b64a3f2"). InnerVolumeSpecName "kube-api-access-8h2tj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:01:20 crc kubenswrapper[4685]: I0321 04:01:20.890034 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1576ba2d-61cd-4dde-b4ce-eab38b64a3f2-util" (OuterVolumeSpecName: "util") pod "1576ba2d-61cd-4dde-b4ce-eab38b64a3f2" (UID: "1576ba2d-61cd-4dde-b4ce-eab38b64a3f2"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:01:20 crc kubenswrapper[4685]: I0321 04:01:20.970578 4685 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1576ba2d-61cd-4dde-b4ce-eab38b64a3f2-util\") on node \"crc\" DevicePath \"\"" Mar 21 04:01:20 crc kubenswrapper[4685]: I0321 04:01:20.970626 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8h2tj\" (UniqueName: \"kubernetes.io/projected/1576ba2d-61cd-4dde-b4ce-eab38b64a3f2-kube-api-access-8h2tj\") on node \"crc\" DevicePath \"\"" Mar 21 04:01:20 crc kubenswrapper[4685]: I0321 04:01:20.970660 4685 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1576ba2d-61cd-4dde-b4ce-eab38b64a3f2-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:01:21 crc kubenswrapper[4685]: I0321 04:01:21.495132 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7e3b51b6a8afd7c782c20eee88f3ff29a8534df38096cf089dfa437de6m8ksc" event={"ID":"1576ba2d-61cd-4dde-b4ce-eab38b64a3f2","Type":"ContainerDied","Data":"dfdcdf8e912519cbcead21df7f1269d84d40bd17943d14b83180bc4fd66578de"} Mar 21 04:01:21 crc kubenswrapper[4685]: I0321 04:01:21.495167 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfdcdf8e912519cbcead21df7f1269d84d40bd17943d14b83180bc4fd66578de" Mar 21 04:01:21 crc kubenswrapper[4685]: I0321 04:01:21.495200 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7e3b51b6a8afd7c782c20eee88f3ff29a8534df38096cf089dfa437de6m8ksc" Mar 21 04:01:29 crc kubenswrapper[4685]: I0321 04:01:29.280031 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6697764dc7-vzzbc"] Mar 21 04:01:29 crc kubenswrapper[4685]: E0321 04:01:29.280692 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1576ba2d-61cd-4dde-b4ce-eab38b64a3f2" containerName="extract" Mar 21 04:01:29 crc kubenswrapper[4685]: I0321 04:01:29.280708 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="1576ba2d-61cd-4dde-b4ce-eab38b64a3f2" containerName="extract" Mar 21 04:01:29 crc kubenswrapper[4685]: E0321 04:01:29.280727 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1576ba2d-61cd-4dde-b4ce-eab38b64a3f2" containerName="pull" Mar 21 04:01:29 crc kubenswrapper[4685]: I0321 04:01:29.280736 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="1576ba2d-61cd-4dde-b4ce-eab38b64a3f2" containerName="pull" Mar 21 04:01:29 crc kubenswrapper[4685]: E0321 04:01:29.280759 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1576ba2d-61cd-4dde-b4ce-eab38b64a3f2" containerName="util" Mar 21 04:01:29 crc kubenswrapper[4685]: I0321 04:01:29.280768 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="1576ba2d-61cd-4dde-b4ce-eab38b64a3f2" containerName="util" Mar 21 04:01:29 crc kubenswrapper[4685]: I0321 04:01:29.281091 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="1576ba2d-61cd-4dde-b4ce-eab38b64a3f2" containerName="extract" Mar 21 04:01:29 crc kubenswrapper[4685]: I0321 04:01:29.281532 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6697764dc7-vzzbc" Mar 21 04:01:29 crc kubenswrapper[4685]: I0321 04:01:29.284183 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 21 04:01:29 crc kubenswrapper[4685]: I0321 04:01:29.284211 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert" Mar 21 04:01:29 crc kubenswrapper[4685]: I0321 04:01:29.284228 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-fwgh5" Mar 21 04:01:29 crc kubenswrapper[4685]: I0321 04:01:29.297678 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6697764dc7-vzzbc"] Mar 21 04:01:29 crc kubenswrapper[4685]: I0321 04:01:29.372668 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3be15e47-4dd7-4e37-81ac-a3e7ec139af1-webhook-cert\") pod \"mariadb-operator-controller-manager-6697764dc7-vzzbc\" (UID: \"3be15e47-4dd7-4e37-81ac-a3e7ec139af1\") " pod="openstack-operators/mariadb-operator-controller-manager-6697764dc7-vzzbc" Mar 21 04:01:29 crc kubenswrapper[4685]: I0321 04:01:29.372734 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzfsx\" (UniqueName: \"kubernetes.io/projected/3be15e47-4dd7-4e37-81ac-a3e7ec139af1-kube-api-access-dzfsx\") pod \"mariadb-operator-controller-manager-6697764dc7-vzzbc\" (UID: \"3be15e47-4dd7-4e37-81ac-a3e7ec139af1\") " pod="openstack-operators/mariadb-operator-controller-manager-6697764dc7-vzzbc" Mar 21 04:01:29 crc kubenswrapper[4685]: I0321 04:01:29.372797 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3be15e47-4dd7-4e37-81ac-a3e7ec139af1-apiservice-cert\") pod \"mariadb-operator-controller-manager-6697764dc7-vzzbc\" (UID: \"3be15e47-4dd7-4e37-81ac-a3e7ec139af1\") " pod="openstack-operators/mariadb-operator-controller-manager-6697764dc7-vzzbc" Mar 21 04:01:29 crc kubenswrapper[4685]: I0321 04:01:29.474011 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3be15e47-4dd7-4e37-81ac-a3e7ec139af1-webhook-cert\") pod \"mariadb-operator-controller-manager-6697764dc7-vzzbc\" (UID: \"3be15e47-4dd7-4e37-81ac-a3e7ec139af1\") " pod="openstack-operators/mariadb-operator-controller-manager-6697764dc7-vzzbc" Mar 21 04:01:29 crc kubenswrapper[4685]: I0321 04:01:29.474065 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzfsx\" (UniqueName: \"kubernetes.io/projected/3be15e47-4dd7-4e37-81ac-a3e7ec139af1-kube-api-access-dzfsx\") pod \"mariadb-operator-controller-manager-6697764dc7-vzzbc\" (UID: \"3be15e47-4dd7-4e37-81ac-a3e7ec139af1\") " pod="openstack-operators/mariadb-operator-controller-manager-6697764dc7-vzzbc" Mar 21 04:01:29 crc kubenswrapper[4685]: I0321 04:01:29.474113 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3be15e47-4dd7-4e37-81ac-a3e7ec139af1-apiservice-cert\") pod \"mariadb-operator-controller-manager-6697764dc7-vzzbc\" (UID: \"3be15e47-4dd7-4e37-81ac-a3e7ec139af1\") 
" pod="openstack-operators/mariadb-operator-controller-manager-6697764dc7-vzzbc" Mar 21 04:01:29 crc kubenswrapper[4685]: I0321 04:01:29.481786 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3be15e47-4dd7-4e37-81ac-a3e7ec139af1-apiservice-cert\") pod \"mariadb-operator-controller-manager-6697764dc7-vzzbc\" (UID: \"3be15e47-4dd7-4e37-81ac-a3e7ec139af1\") " pod="openstack-operators/mariadb-operator-controller-manager-6697764dc7-vzzbc" Mar 21 04:01:29 crc kubenswrapper[4685]: I0321 04:01:29.482952 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3be15e47-4dd7-4e37-81ac-a3e7ec139af1-webhook-cert\") pod \"mariadb-operator-controller-manager-6697764dc7-vzzbc\" (UID: \"3be15e47-4dd7-4e37-81ac-a3e7ec139af1\") " pod="openstack-operators/mariadb-operator-controller-manager-6697764dc7-vzzbc" Mar 21 04:01:29 crc kubenswrapper[4685]: I0321 04:01:29.490283 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzfsx\" (UniqueName: \"kubernetes.io/projected/3be15e47-4dd7-4e37-81ac-a3e7ec139af1-kube-api-access-dzfsx\") pod \"mariadb-operator-controller-manager-6697764dc7-vzzbc\" (UID: \"3be15e47-4dd7-4e37-81ac-a3e7ec139af1\") " pod="openstack-operators/mariadb-operator-controller-manager-6697764dc7-vzzbc" Mar 21 04:01:29 crc kubenswrapper[4685]: I0321 04:01:29.599622 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6697764dc7-vzzbc" Mar 21 04:01:30 crc kubenswrapper[4685]: I0321 04:01:30.041169 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6697764dc7-vzzbc"] Mar 21 04:01:30 crc kubenswrapper[4685]: I0321 04:01:30.564963 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6697764dc7-vzzbc" event={"ID":"3be15e47-4dd7-4e37-81ac-a3e7ec139af1","Type":"ContainerStarted","Data":"e68f673c02549104aea351f777a21b53b632feb87fbd529fac341c6f4798ec62"} Mar 21 04:01:33 crc kubenswrapper[4685]: I0321 04:01:33.587949 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6697764dc7-vzzbc" event={"ID":"3be15e47-4dd7-4e37-81ac-a3e7ec139af1","Type":"ContainerStarted","Data":"b5c9b4de08deb332e3fc18bfe577959d3b1748a282928fd0e7147671831d4c01"} Mar 21 04:01:33 crc kubenswrapper[4685]: I0321 04:01:33.588521 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6697764dc7-vzzbc" Mar 21 04:01:39 crc kubenswrapper[4685]: I0321 04:01:39.606070 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6697764dc7-vzzbc" Mar 21 04:01:39 crc kubenswrapper[4685]: I0321 04:01:39.633287 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6697764dc7-vzzbc" podStartSLOduration=7.673898775 podStartE2EDuration="10.633271063s" podCreationTimestamp="2026-03-21 04:01:29 +0000 UTC" firstStartedPulling="2026-03-21 04:01:30.052173472 +0000 UTC m=+922.529242274" lastFinishedPulling="2026-03-21 04:01:33.01154577 +0000 UTC m=+925.488614562" observedRunningTime="2026-03-21 04:01:33.605684711 +0000 UTC m=+926.082753553" watchObservedRunningTime="2026-03-21 
04:01:39.633271063 +0000 UTC m=+932.110339875" Mar 21 04:01:41 crc kubenswrapper[4685]: I0321 04:01:41.639542 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-8t44g"] Mar 21 04:01:41 crc kubenswrapper[4685]: I0321 04:01:41.640444 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-8t44g" Mar 21 04:01:41 crc kubenswrapper[4685]: I0321 04:01:41.643369 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-444l2" Mar 21 04:01:41 crc kubenswrapper[4685]: I0321 04:01:41.649953 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-8t44g"] Mar 21 04:01:41 crc kubenswrapper[4685]: I0321 04:01:41.725605 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh8nw\" (UniqueName: \"kubernetes.io/projected/1259cc78-b010-41b9-a2cb-97b75cf30c74-kube-api-access-zh8nw\") pod \"infra-operator-index-8t44g\" (UID: \"1259cc78-b010-41b9-a2cb-97b75cf30c74\") " pod="openstack-operators/infra-operator-index-8t44g" Mar 21 04:01:41 crc kubenswrapper[4685]: I0321 04:01:41.826921 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh8nw\" (UniqueName: \"kubernetes.io/projected/1259cc78-b010-41b9-a2cb-97b75cf30c74-kube-api-access-zh8nw\") pod \"infra-operator-index-8t44g\" (UID: \"1259cc78-b010-41b9-a2cb-97b75cf30c74\") " pod="openstack-operators/infra-operator-index-8t44g" Mar 21 04:01:41 crc kubenswrapper[4685]: I0321 04:01:41.843826 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh8nw\" (UniqueName: \"kubernetes.io/projected/1259cc78-b010-41b9-a2cb-97b75cf30c74-kube-api-access-zh8nw\") pod \"infra-operator-index-8t44g\" (UID: \"1259cc78-b010-41b9-a2cb-97b75cf30c74\") " pod="openstack-operators/infra-operator-index-8t44g" Mar 21 04:01:41 crc kubenswrapper[4685]: I0321 04:01:41.963692 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-8t44g" Mar 21 04:01:42 crc kubenswrapper[4685]: I0321 04:01:42.155680 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-8t44g"] Mar 21 04:01:42 crc kubenswrapper[4685]: I0321 04:01:42.642129 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-8t44g" event={"ID":"1259cc78-b010-41b9-a2cb-97b75cf30c74","Type":"ContainerStarted","Data":"4daf27bc4a0441a538148bf21aae6631897d7f297b13a10b6ecc2507f9712217"} Mar 21 04:01:43 crc kubenswrapper[4685]: I0321 04:01:43.651361 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-8t44g" event={"ID":"1259cc78-b010-41b9-a2cb-97b75cf30c74","Type":"ContainerStarted","Data":"979bdce4aa64844ec0f9b63f8351c361c9f8ec925d847befdb5ce78dcd35579f"} Mar 21 04:01:43 crc kubenswrapper[4685]: I0321 04:01:43.672536 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-8t44g" podStartSLOduration=1.9450261690000001 podStartE2EDuration="2.672519618s" podCreationTimestamp="2026-03-21 04:01:41 +0000 UTC" firstStartedPulling="2026-03-21 04:01:42.162472994 +0000 UTC m=+934.639541786" lastFinishedPulling="2026-03-21 04:01:42.889966433 +0000 UTC m=+935.367035235" observedRunningTime="2026-03-21 04:01:43.671396746 +0000 UTC m=+936.148465578" watchObservedRunningTime="2026-03-21 04:01:43.672519618 +0000 UTC m=+936.149588410" Mar 21 04:01:44 crc kubenswrapper[4685]: I0321 04:01:44.837418 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-8t44g"] Mar 21 04:01:45 crc kubenswrapper[4685]: I0321 04:01:45.445491 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-2d98g"] Mar 21 04:01:45 crc kubenswrapper[4685]: I0321 04:01:45.446152 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-2d98g" Mar 21 04:01:45 crc kubenswrapper[4685]: I0321 04:01:45.458939 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-2d98g"] Mar 21 04:01:45 crc kubenswrapper[4685]: I0321 04:01:45.573446 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5hv7\" (UniqueName: \"kubernetes.io/projected/2599b123-88b7-41bb-981a-ce52020584c9-kube-api-access-k5hv7\") pod \"infra-operator-index-2d98g\" (UID: \"2599b123-88b7-41bb-981a-ce52020584c9\") " pod="openstack-operators/infra-operator-index-2d98g" Mar 21 04:01:45 crc kubenswrapper[4685]: I0321 04:01:45.662812 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-8t44g" podUID="1259cc78-b010-41b9-a2cb-97b75cf30c74" containerName="registry-server" containerID="cri-o://979bdce4aa64844ec0f9b63f8351c361c9f8ec925d847befdb5ce78dcd35579f" gracePeriod=2 Mar 21 04:01:45 crc kubenswrapper[4685]: I0321 04:01:45.674469 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5hv7\" (UniqueName: \"kubernetes.io/projected/2599b123-88b7-41bb-981a-ce52020584c9-kube-api-access-k5hv7\") pod \"infra-operator-index-2d98g\" (UID: \"2599b123-88b7-41bb-981a-ce52020584c9\") " pod="openstack-operators/infra-operator-index-2d98g" Mar 21 04:01:45 crc kubenswrapper[4685]: I0321 04:01:45.697040 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5hv7\" (UniqueName: \"kubernetes.io/projected/2599b123-88b7-41bb-981a-ce52020584c9-kube-api-access-k5hv7\") pod \"infra-operator-index-2d98g\" (UID: \"2599b123-88b7-41bb-981a-ce52020584c9\") " pod="openstack-operators/infra-operator-index-2d98g" Mar 21 04:01:45 crc kubenswrapper[4685]: I0321 04:01:45.764881 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-2d98g" Mar 21 04:01:46 crc kubenswrapper[4685]: I0321 04:01:46.004105 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-2d98g"] Mar 21 04:01:46 crc kubenswrapper[4685]: W0321 04:01:46.016058 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2599b123_88b7_41bb_981a_ce52020584c9.slice/crio-a54a1f8b028cfd26b8ffd3d9416b4bcdd7fbca4ee11acd3f62d66c953e6970ab WatchSource:0}: Error finding container a54a1f8b028cfd26b8ffd3d9416b4bcdd7fbca4ee11acd3f62d66c953e6970ab: Status 404 returned error can't find the container with id a54a1f8b028cfd26b8ffd3d9416b4bcdd7fbca4ee11acd3f62d66c953e6970ab Mar 21 04:01:46 crc kubenswrapper[4685]: I0321 04:01:46.052452 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-8t44g" Mar 21 04:01:46 crc kubenswrapper[4685]: I0321 04:01:46.188561 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh8nw\" (UniqueName: \"kubernetes.io/projected/1259cc78-b010-41b9-a2cb-97b75cf30c74-kube-api-access-zh8nw\") pod \"1259cc78-b010-41b9-a2cb-97b75cf30c74\" (UID: \"1259cc78-b010-41b9-a2cb-97b75cf30c74\") " Mar 21 04:01:46 crc kubenswrapper[4685]: I0321 04:01:46.196281 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1259cc78-b010-41b9-a2cb-97b75cf30c74-kube-api-access-zh8nw" (OuterVolumeSpecName: "kube-api-access-zh8nw") pod "1259cc78-b010-41b9-a2cb-97b75cf30c74" (UID: "1259cc78-b010-41b9-a2cb-97b75cf30c74"). InnerVolumeSpecName "kube-api-access-zh8nw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:01:46 crc kubenswrapper[4685]: I0321 04:01:46.290541 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh8nw\" (UniqueName: \"kubernetes.io/projected/1259cc78-b010-41b9-a2cb-97b75cf30c74-kube-api-access-zh8nw\") on node \"crc\" DevicePath \"\"" Mar 21 04:01:46 crc kubenswrapper[4685]: I0321 04:01:46.669582 4685 generic.go:334] "Generic (PLEG): container finished" podID="1259cc78-b010-41b9-a2cb-97b75cf30c74" containerID="979bdce4aa64844ec0f9b63f8351c361c9f8ec925d847befdb5ce78dcd35579f" exitCode=0 Mar 21 04:01:46 crc kubenswrapper[4685]: I0321 04:01:46.669748 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-8t44g" event={"ID":"1259cc78-b010-41b9-a2cb-97b75cf30c74","Type":"ContainerDied","Data":"979bdce4aa64844ec0f9b63f8351c361c9f8ec925d847befdb5ce78dcd35579f"} Mar 21 04:01:46 crc kubenswrapper[4685]: I0321 04:01:46.669891 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-8t44g" event={"ID":"1259cc78-b010-41b9-a2cb-97b75cf30c74","Type":"ContainerDied","Data":"4daf27bc4a0441a538148bf21aae6631897d7f297b13a10b6ecc2507f9712217"} Mar 21 04:01:46 crc kubenswrapper[4685]: I0321 04:01:46.669914 4685 scope.go:117] "RemoveContainer" containerID="979bdce4aa64844ec0f9b63f8351c361c9f8ec925d847befdb5ce78dcd35579f" Mar 21 04:01:46 crc kubenswrapper[4685]: I0321 04:01:46.670211 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-8t44g" Mar 21 04:01:46 crc kubenswrapper[4685]: I0321 04:01:46.672102 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-2d98g" event={"ID":"2599b123-88b7-41bb-981a-ce52020584c9","Type":"ContainerStarted","Data":"cbfb74167f21b03e55f48da2e64caaef2a70c86f9394cda21d8a5d2101b088dc"} Mar 21 04:01:46 crc kubenswrapper[4685]: I0321 04:01:46.672125 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-2d98g" event={"ID":"2599b123-88b7-41bb-981a-ce52020584c9","Type":"ContainerStarted","Data":"a54a1f8b028cfd26b8ffd3d9416b4bcdd7fbca4ee11acd3f62d66c953e6970ab"} Mar 21 04:01:46 crc kubenswrapper[4685]: I0321 04:01:46.683769 4685 scope.go:117] "RemoveContainer" containerID="979bdce4aa64844ec0f9b63f8351c361c9f8ec925d847befdb5ce78dcd35579f" Mar 21 04:01:46 crc kubenswrapper[4685]: E0321 04:01:46.684075 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"979bdce4aa64844ec0f9b63f8351c361c9f8ec925d847befdb5ce78dcd35579f\": container with ID starting with 979bdce4aa64844ec0f9b63f8351c361c9f8ec925d847befdb5ce78dcd35579f not found: ID does not exist" containerID="979bdce4aa64844ec0f9b63f8351c361c9f8ec925d847befdb5ce78dcd35579f" Mar 21 04:01:46 crc kubenswrapper[4685]: I0321 04:01:46.684109 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"979bdce4aa64844ec0f9b63f8351c361c9f8ec925d847befdb5ce78dcd35579f"} err="failed to get container status \"979bdce4aa64844ec0f9b63f8351c361c9f8ec925d847befdb5ce78dcd35579f\": rpc error: code = NotFound desc = could not find container \"979bdce4aa64844ec0f9b63f8351c361c9f8ec925d847befdb5ce78dcd35579f\": container with ID starting with 979bdce4aa64844ec0f9b63f8351c361c9f8ec925d847befdb5ce78dcd35579f not found: ID does not exist" Mar 21 04:01:46 crc kubenswrapper[4685]: I0321 04:01:46.688371 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-2d98g" podStartSLOduration=1.277855788 podStartE2EDuration="1.688360984s" podCreationTimestamp="2026-03-21 04:01:45 +0000 UTC" firstStartedPulling="2026-03-21 04:01:46.020032921 +0000 UTC m=+938.497101723" lastFinishedPulling="2026-03-21 04:01:46.430538127 +0000 UTC m=+938.907606919" observedRunningTime="2026-03-21 04:01:46.687311834 +0000 UTC m=+939.164380646" watchObservedRunningTime="2026-03-21 04:01:46.688360984 +0000 UTC m=+939.165429776" Mar 21 04:01:46 crc kubenswrapper[4685]: I0321 04:01:46.705407 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-8t44g"] Mar 21 04:01:46 crc kubenswrapper[4685]: I0321 04:01:46.705965 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-8t44g"] Mar 21 04:01:48 crc kubenswrapper[4685]: I0321 04:01:48.312470 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1259cc78-b010-41b9-a2cb-97b75cf30c74" path="/var/lib/kubelet/pods/1259cc78-b010-41b9-a2cb-97b75cf30c74/volumes" Mar 21 04:01:55 crc kubenswrapper[4685]: I0321 04:01:55.765361 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/infra-operator-index-2d98g" Mar 21 04:01:55 crc kubenswrapper[4685]: I0321 04:01:55.765735 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/infra-operator-index-2d98g" Mar 21 04:01:55 crc kubenswrapper[4685]: I0321 04:01:55.797897 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/infra-operator-index-2d98g" Mar 21 04:01:56 crc kubenswrapper[4685]: I0321 04:01:56.759742 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-index-2d98g" Mar 21 04:02:00 crc kubenswrapper[4685]: I0321 04:02:00.153775 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567762-7g776"] Mar 21 04:02:00 crc kubenswrapper[4685]: E0321 04:02:00.154524 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1259cc78-b010-41b9-a2cb-97b75cf30c74" containerName="registry-server" Mar 21 04:02:00 crc kubenswrapper[4685]: I0321 04:02:00.154545 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="1259cc78-b010-41b9-a2cb-97b75cf30c74" containerName="registry-server" Mar 21 04:02:00 crc kubenswrapper[4685]: I0321 04:02:00.154702 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="1259cc78-b010-41b9-a2cb-97b75cf30c74" containerName="registry-server" Mar 21 04:02:00 crc kubenswrapper[4685]: I0321 04:02:00.155309 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567762-7g776" Mar 21 04:02:00 crc kubenswrapper[4685]: I0321 04:02:00.160061 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 04:02:00 crc kubenswrapper[4685]: I0321 04:02:00.160471 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567762-7g776"] Mar 21 04:02:00 crc kubenswrapper[4685]: I0321 04:02:00.160969 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mpzj\" (UniqueName: \"kubernetes.io/projected/e1cc3737-5c39-4975-af9a-f403efa1e0b7-kube-api-access-2mpzj\") pod \"auto-csr-approver-29567762-7g776\" (UID: \"e1cc3737-5c39-4975-af9a-f403efa1e0b7\") " pod="openshift-infra/auto-csr-approver-29567762-7g776" Mar 21 04:02:00 crc kubenswrapper[4685]: I0321 04:02:00.162421 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k75cc" Mar 21 04:02:00 crc kubenswrapper[4685]: I0321 04:02:00.162756 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 04:02:00 crc kubenswrapper[4685]: I0321 04:02:00.262062 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mpzj\" (UniqueName: \"kubernetes.io/projected/e1cc3737-5c39-4975-af9a-f403efa1e0b7-kube-api-access-2mpzj\") pod \"auto-csr-approver-29567762-7g776\" (UID: \"e1cc3737-5c39-4975-af9a-f403efa1e0b7\") " pod="openshift-infra/auto-csr-approver-29567762-7g776" Mar 21 04:02:00 crc kubenswrapper[4685]: I0321 04:02:00.289123 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mpzj\" (UniqueName: \"kubernetes.io/projected/e1cc3737-5c39-4975-af9a-f403efa1e0b7-kube-api-access-2mpzj\") pod \"auto-csr-approver-29567762-7g776\" (UID: \"e1cc3737-5c39-4975-af9a-f403efa1e0b7\") " pod="openshift-infra/auto-csr-approver-29567762-7g776" Mar 21 04:02:00 crc kubenswrapper[4685]: I0321 04:02:00.550035 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567762-7g776" Mar 21 04:02:00 crc kubenswrapper[4685]: I0321 04:02:00.933050 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567762-7g776"] Mar 21 04:02:01 crc kubenswrapper[4685]: I0321 04:02:01.767044 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567762-7g776" event={"ID":"e1cc3737-5c39-4975-af9a-f403efa1e0b7","Type":"ContainerStarted","Data":"219cf88d00f1adade94108319686f5781c7f8794473309c0f64fdc29f24b583a"} Mar 21 04:02:02 crc kubenswrapper[4685]: I0321 04:02:02.775349 4685 generic.go:334] "Generic (PLEG): container finished" podID="e1cc3737-5c39-4975-af9a-f403efa1e0b7" containerID="292f6eff2ae40856013b4696f5aebcf7f360d67a6d20fb023657731f90209e5b" exitCode=0 Mar 21 04:02:02 crc kubenswrapper[4685]: I0321 04:02:02.775464 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567762-7g776" event={"ID":"e1cc3737-5c39-4975-af9a-f403efa1e0b7","Type":"ContainerDied","Data":"292f6eff2ae40856013b4696f5aebcf7f360d67a6d20fb023657731f90209e5b"} Mar 21 04:02:03 crc kubenswrapper[4685]: I0321 04:02:03.108891 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/014ac6de79e0394b0d824e62fe3d0678564789565fd857972d41b2afd4f9qmw"] Mar 21 04:02:03 crc kubenswrapper[4685]: I0321 04:02:03.114661 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/014ac6de79e0394b0d824e62fe3d0678564789565fd857972d41b2afd4f9qmw" Mar 21 04:02:03 crc kubenswrapper[4685]: I0321 04:02:03.118730 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-2vwsz" Mar 21 04:02:03 crc kubenswrapper[4685]: I0321 04:02:03.150182 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/014ac6de79e0394b0d824e62fe3d0678564789565fd857972d41b2afd4f9qmw"] Mar 21 04:02:03 crc kubenswrapper[4685]: I0321 04:02:03.199477 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6f4b7f3-a938-4717-a6aa-cc9b620738b0-util\") pod \"014ac6de79e0394b0d824e62fe3d0678564789565fd857972d41b2afd4f9qmw\" (UID: \"e6f4b7f3-a938-4717-a6aa-cc9b620738b0\") " pod="openstack-operators/014ac6de79e0394b0d824e62fe3d0678564789565fd857972d41b2afd4f9qmw" Mar 21 04:02:03 crc kubenswrapper[4685]: I0321 04:02:03.199547 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2z94\" (UniqueName: \"kubernetes.io/projected/e6f4b7f3-a938-4717-a6aa-cc9b620738b0-kube-api-access-x2z94\") pod \"014ac6de79e0394b0d824e62fe3d0678564789565fd857972d41b2afd4f9qmw\" (UID: \"e6f4b7f3-a938-4717-a6aa-cc9b620738b0\") " pod="openstack-operators/014ac6de79e0394b0d824e62fe3d0678564789565fd857972d41b2afd4f9qmw" Mar 21 04:02:03 crc kubenswrapper[4685]: I0321 04:02:03.199581 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6f4b7f3-a938-4717-a6aa-cc9b620738b0-bundle\") pod \"014ac6de79e0394b0d824e62fe3d0678564789565fd857972d41b2afd4f9qmw\" (UID: \"e6f4b7f3-a938-4717-a6aa-cc9b620738b0\") " pod="openstack-operators/014ac6de79e0394b0d824e62fe3d0678564789565fd857972d41b2afd4f9qmw" Mar 21 04:02:03 crc kubenswrapper[4685]: I0321 04:02:03.300201 4685 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6f4b7f3-a938-4717-a6aa-cc9b620738b0-bundle\") pod \"014ac6de79e0394b0d824e62fe3d0678564789565fd857972d41b2afd4f9qmw\" (UID: \"e6f4b7f3-a938-4717-a6aa-cc9b620738b0\") " pod="openstack-operators/014ac6de79e0394b0d824e62fe3d0678564789565fd857972d41b2afd4f9qmw" Mar 21 04:02:03 crc kubenswrapper[4685]: I0321 04:02:03.300304 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6f4b7f3-a938-4717-a6aa-cc9b620738b0-util\") pod \"014ac6de79e0394b0d824e62fe3d0678564789565fd857972d41b2afd4f9qmw\" (UID: \"e6f4b7f3-a938-4717-a6aa-cc9b620738b0\") " pod="openstack-operators/014ac6de79e0394b0d824e62fe3d0678564789565fd857972d41b2afd4f9qmw" Mar 21 04:02:03 crc kubenswrapper[4685]: I0321 04:02:03.300398 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2z94\" (UniqueName: \"kubernetes.io/projected/e6f4b7f3-a938-4717-a6aa-cc9b620738b0-kube-api-access-x2z94\") pod \"014ac6de79e0394b0d824e62fe3d0678564789565fd857972d41b2afd4f9qmw\" (UID: \"e6f4b7f3-a938-4717-a6aa-cc9b620738b0\") " pod="openstack-operators/014ac6de79e0394b0d824e62fe3d0678564789565fd857972d41b2afd4f9qmw" Mar 21 04:02:03 crc kubenswrapper[4685]: I0321 04:02:03.300770 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6f4b7f3-a938-4717-a6aa-cc9b620738b0-bundle\") pod \"014ac6de79e0394b0d824e62fe3d0678564789565fd857972d41b2afd4f9qmw\" (UID: \"e6f4b7f3-a938-4717-a6aa-cc9b620738b0\") " pod="openstack-operators/014ac6de79e0394b0d824e62fe3d0678564789565fd857972d41b2afd4f9qmw" Mar 21 04:02:03 crc kubenswrapper[4685]: I0321 04:02:03.301032 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6f4b7f3-a938-4717-a6aa-cc9b620738b0-util\") pod \"014ac6de79e0394b0d824e62fe3d0678564789565fd857972d41b2afd4f9qmw\" (UID: \"e6f4b7f3-a938-4717-a6aa-cc9b620738b0\") " pod="openstack-operators/014ac6de79e0394b0d824e62fe3d0678564789565fd857972d41b2afd4f9qmw" Mar 21 04:02:03 crc kubenswrapper[4685]: I0321 04:02:03.319189 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2z94\" (UniqueName: \"kubernetes.io/projected/e6f4b7f3-a938-4717-a6aa-cc9b620738b0-kube-api-access-x2z94\") pod \"014ac6de79e0394b0d824e62fe3d0678564789565fd857972d41b2afd4f9qmw\" (UID: \"e6f4b7f3-a938-4717-a6aa-cc9b620738b0\") " pod="openstack-operators/014ac6de79e0394b0d824e62fe3d0678564789565fd857972d41b2afd4f9qmw" Mar 21 04:02:03 crc kubenswrapper[4685]: I0321 04:02:03.466818 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/014ac6de79e0394b0d824e62fe3d0678564789565fd857972d41b2afd4f9qmw" Mar 21 04:02:03 crc kubenswrapper[4685]: I0321 04:02:03.897978 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/014ac6de79e0394b0d824e62fe3d0678564789565fd857972d41b2afd4f9qmw"] Mar 21 04:02:03 crc kubenswrapper[4685]: I0321 04:02:03.999872 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567762-7g776" Mar 21 04:02:04 crc kubenswrapper[4685]: I0321 04:02:04.108977 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mpzj\" (UniqueName: \"kubernetes.io/projected/e1cc3737-5c39-4975-af9a-f403efa1e0b7-kube-api-access-2mpzj\") pod \"e1cc3737-5c39-4975-af9a-f403efa1e0b7\" (UID: \"e1cc3737-5c39-4975-af9a-f403efa1e0b7\") " Mar 21 04:02:04 crc kubenswrapper[4685]: I0321 04:02:04.114230 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1cc3737-5c39-4975-af9a-f403efa1e0b7-kube-api-access-2mpzj" (OuterVolumeSpecName: "kube-api-access-2mpzj") pod "e1cc3737-5c39-4975-af9a-f403efa1e0b7" (UID: "e1cc3737-5c39-4975-af9a-f403efa1e0b7"). InnerVolumeSpecName "kube-api-access-2mpzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:02:04 crc kubenswrapper[4685]: I0321 04:02:04.210969 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mpzj\" (UniqueName: \"kubernetes.io/projected/e1cc3737-5c39-4975-af9a-f403efa1e0b7-kube-api-access-2mpzj\") on node \"crc\" DevicePath \"\"" Mar 21 04:02:04 crc kubenswrapper[4685]: I0321 04:02:04.791070 4685 generic.go:334] "Generic (PLEG): container finished" podID="e6f4b7f3-a938-4717-a6aa-cc9b620738b0" containerID="e1a2e139b19c8746eee0fb0420f2322d3ae267c77164a3e29764c388194f4305" exitCode=0 Mar 21 04:02:04 crc kubenswrapper[4685]: I0321 04:02:04.791152 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/014ac6de79e0394b0d824e62fe3d0678564789565fd857972d41b2afd4f9qmw" event={"ID":"e6f4b7f3-a938-4717-a6aa-cc9b620738b0","Type":"ContainerDied","Data":"e1a2e139b19c8746eee0fb0420f2322d3ae267c77164a3e29764c388194f4305"} Mar 21 04:02:04 crc kubenswrapper[4685]: I0321 04:02:04.791182 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/014ac6de79e0394b0d824e62fe3d0678564789565fd857972d41b2afd4f9qmw" event={"ID":"e6f4b7f3-a938-4717-a6aa-cc9b620738b0","Type":"ContainerStarted","Data":"c6a426e9aa5bc3ba17af45ded9a1bacf406094284e17b712840eb700df1a8f85"} Mar 21 04:02:04 crc kubenswrapper[4685]: I0321 04:02:04.792721 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567762-7g776" Mar 21 04:02:04 crc kubenswrapper[4685]: I0321 04:02:04.792753 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567762-7g776" event={"ID":"e1cc3737-5c39-4975-af9a-f403efa1e0b7","Type":"ContainerDied","Data":"219cf88d00f1adade94108319686f5781c7f8794473309c0f64fdc29f24b583a"} Mar 21 04:02:04 crc kubenswrapper[4685]: I0321 04:02:04.792782 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="219cf88d00f1adade94108319686f5781c7f8794473309c0f64fdc29f24b583a" Mar 21 04:02:05 crc kubenswrapper[4685]: I0321 04:02:05.063951 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567756-c8grb"] Mar 21 04:02:05 crc kubenswrapper[4685]: I0321 04:02:05.069371 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567756-c8grb"] Mar 21 04:02:05 crc kubenswrapper[4685]: I0321 04:02:05.798803 4685 generic.go:334] "Generic (PLEG): container finished" podID="e6f4b7f3-a938-4717-a6aa-cc9b620738b0" containerID="976b2eff86b07bc03d8c0610b631694b641b44bb86692d56bf5953f1c5ee2337" exitCode=0 Mar 21 04:02:05 crc kubenswrapper[4685]: I0321 04:02:05.798865 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/014ac6de79e0394b0d824e62fe3d0678564789565fd857972d41b2afd4f9qmw" event={"ID":"e6f4b7f3-a938-4717-a6aa-cc9b620738b0","Type":"ContainerDied","Data":"976b2eff86b07bc03d8c0610b631694b641b44bb86692d56bf5953f1c5ee2337"} Mar 21 04:02:06 crc kubenswrapper[4685]: I0321 04:02:06.312171 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4329ac2-6343-445e-95a6-09a4c21aeef4" path="/var/lib/kubelet/pods/a4329ac2-6343-445e-95a6-09a4c21aeef4/volumes" Mar 21 04:02:06 crc kubenswrapper[4685]: I0321 04:02:06.813069 4685 generic.go:334] "Generic (PLEG): container finished" podID="e6f4b7f3-a938-4717-a6aa-cc9b620738b0" containerID="7acbb1263af799a95715386bddb8f2b759c9c96f371dd50aa9df8942e78be9b4" exitCode=0 Mar 21 04:02:06 crc kubenswrapper[4685]: I0321 04:02:06.813243 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/014ac6de79e0394b0d824e62fe3d0678564789565fd857972d41b2afd4f9qmw" event={"ID":"e6f4b7f3-a938-4717-a6aa-cc9b620738b0","Type":"ContainerDied","Data":"7acbb1263af799a95715386bddb8f2b759c9c96f371dd50aa9df8942e78be9b4"} Mar 21 04:02:08 crc kubenswrapper[4685]: I0321 04:02:08.141352 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/014ac6de79e0394b0d824e62fe3d0678564789565fd857972d41b2afd4f9qmw" Mar 21 04:02:08 crc kubenswrapper[4685]: I0321 04:02:08.265283 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6f4b7f3-a938-4717-a6aa-cc9b620738b0-util\") pod \"e6f4b7f3-a938-4717-a6aa-cc9b620738b0\" (UID: \"e6f4b7f3-a938-4717-a6aa-cc9b620738b0\") " Mar 21 04:02:08 crc kubenswrapper[4685]: I0321 04:02:08.265417 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6f4b7f3-a938-4717-a6aa-cc9b620738b0-bundle\") pod \"e6f4b7f3-a938-4717-a6aa-cc9b620738b0\" (UID: \"e6f4b7f3-a938-4717-a6aa-cc9b620738b0\") " Mar 21 04:02:08 crc kubenswrapper[4685]: I0321 04:02:08.265469 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2z94\" (UniqueName: \"kubernetes.io/projected/e6f4b7f3-a938-4717-a6aa-cc9b620738b0-kube-api-access-x2z94\") pod \"e6f4b7f3-a938-4717-a6aa-cc9b620738b0\" (UID: \"e6f4b7f3-a938-4717-a6aa-cc9b620738b0\") " Mar 21 04:02:08 crc kubenswrapper[4685]: I0321 04:02:08.267569 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6f4b7f3-a938-4717-a6aa-cc9b620738b0-bundle" (OuterVolumeSpecName: "bundle") pod "e6f4b7f3-a938-4717-a6aa-cc9b620738b0" (UID: "e6f4b7f3-a938-4717-a6aa-cc9b620738b0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:02:08 crc kubenswrapper[4685]: I0321 04:02:08.270086 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6f4b7f3-a938-4717-a6aa-cc9b620738b0-kube-api-access-x2z94" (OuterVolumeSpecName: "kube-api-access-x2z94") pod "e6f4b7f3-a938-4717-a6aa-cc9b620738b0" (UID: "e6f4b7f3-a938-4717-a6aa-cc9b620738b0"). InnerVolumeSpecName "kube-api-access-x2z94". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:02:08 crc kubenswrapper[4685]: I0321 04:02:08.280552 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6f4b7f3-a938-4717-a6aa-cc9b620738b0-util" (OuterVolumeSpecName: "util") pod "e6f4b7f3-a938-4717-a6aa-cc9b620738b0" (UID: "e6f4b7f3-a938-4717-a6aa-cc9b620738b0"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:02:08 crc kubenswrapper[4685]: I0321 04:02:08.366709 4685 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6f4b7f3-a938-4717-a6aa-cc9b620738b0-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:02:08 crc kubenswrapper[4685]: I0321 04:02:08.366740 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2z94\" (UniqueName: \"kubernetes.io/projected/e6f4b7f3-a938-4717-a6aa-cc9b620738b0-kube-api-access-x2z94\") on node \"crc\" DevicePath \"\"" Mar 21 04:02:08 crc kubenswrapper[4685]: I0321 04:02:08.366752 4685 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6f4b7f3-a938-4717-a6aa-cc9b620738b0-util\") on node \"crc\" DevicePath \"\"" Mar 21 04:02:08 crc kubenswrapper[4685]: I0321 04:02:08.831635 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/014ac6de79e0394b0d824e62fe3d0678564789565fd857972d41b2afd4f9qmw" event={"ID":"e6f4b7f3-a938-4717-a6aa-cc9b620738b0","Type":"ContainerDied","Data":"c6a426e9aa5bc3ba17af45ded9a1bacf406094284e17b712840eb700df1a8f85"} Mar 21 04:02:08 crc kubenswrapper[4685]: I0321 04:02:08.831685 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6a426e9aa5bc3ba17af45ded9a1bacf406094284e17b712840eb700df1a8f85" Mar 21 04:02:08 crc kubenswrapper[4685]: I0321 04:02:08.831752 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/014ac6de79e0394b0d824e62fe3d0678564789565fd857972d41b2afd4f9qmw" Mar 21 04:02:09 crc kubenswrapper[4685]: I0321 04:02:09.685115 4685 patch_prober.go:28] interesting pod/machine-config-daemon-7r9cg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:02:09 crc kubenswrapper[4685]: I0321 04:02:09.685190 4685 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:02:20 crc kubenswrapper[4685]: I0321 04:02:20.666879 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-6c46fc7bc5-hpt7d"] Mar 21 04:02:20 crc kubenswrapper[4685]: E0321 04:02:20.667782 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6f4b7f3-a938-4717-a6aa-cc9b620738b0" containerName="pull" Mar 21 04:02:20 crc kubenswrapper[4685]: I0321 04:02:20.667796 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6f4b7f3-a938-4717-a6aa-cc9b620738b0" containerName="pull" Mar 21 04:02:20 crc kubenswrapper[4685]: E0321 04:02:20.667812 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6f4b7f3-a938-4717-a6aa-cc9b620738b0" containerName="extract" Mar 21 04:02:20 crc kubenswrapper[4685]: I0321 04:02:20.667818 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6f4b7f3-a938-4717-a6aa-cc9b620738b0" containerName="extract" Mar 21 04:02:20 crc kubenswrapper[4685]: E0321 04:02:20.667858 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1cc3737-5c39-4975-af9a-f403efa1e0b7" containerName="oc" Mar 21 04:02:20 
crc kubenswrapper[4685]: I0321 04:02:20.667866 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1cc3737-5c39-4975-af9a-f403efa1e0b7" containerName="oc" Mar 21 04:02:20 crc kubenswrapper[4685]: E0321 04:02:20.667880 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6f4b7f3-a938-4717-a6aa-cc9b620738b0" containerName="util" Mar 21 04:02:20 crc kubenswrapper[4685]: I0321 04:02:20.667886 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6f4b7f3-a938-4717-a6aa-cc9b620738b0" containerName="util" Mar 21 04:02:20 crc kubenswrapper[4685]: I0321 04:02:20.668098 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1cc3737-5c39-4975-af9a-f403efa1e0b7" containerName="oc" Mar 21 04:02:20 crc kubenswrapper[4685]: I0321 04:02:20.668118 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6f4b7f3-a938-4717-a6aa-cc9b620738b0" containerName="extract" Mar 21 04:02:20 crc kubenswrapper[4685]: I0321 04:02:20.668683 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-6c46fc7bc5-hpt7d" Mar 21 04:02:20 crc kubenswrapper[4685]: I0321 04:02:20.674469 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-service-cert" Mar 21 04:02:20 crc kubenswrapper[4685]: I0321 04:02:20.675184 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-bclg5" Mar 21 04:02:20 crc kubenswrapper[4685]: I0321 04:02:20.688788 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-6c46fc7bc5-hpt7d"] Mar 21 04:02:20 crc kubenswrapper[4685]: I0321 04:02:20.825184 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9hrt\" (UniqueName: \"kubernetes.io/projected/7a568963-c78a-41ee-ab5a-25f5d1eb0bb5-kube-api-access-x9hrt\") pod \"infra-operator-controller-manager-6c46fc7bc5-hpt7d\" (UID: \"7a568963-c78a-41ee-ab5a-25f5d1eb0bb5\") " pod="openstack-operators/infra-operator-controller-manager-6c46fc7bc5-hpt7d" Mar 21 04:02:20 crc kubenswrapper[4685]: I0321 04:02:20.825251 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7a568963-c78a-41ee-ab5a-25f5d1eb0bb5-apiservice-cert\") pod \"infra-operator-controller-manager-6c46fc7bc5-hpt7d\" (UID: \"7a568963-c78a-41ee-ab5a-25f5d1eb0bb5\") " pod="openstack-operators/infra-operator-controller-manager-6c46fc7bc5-hpt7d" Mar 21 04:02:20 crc kubenswrapper[4685]: I0321 04:02:20.825282 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7a568963-c78a-41ee-ab5a-25f5d1eb0bb5-webhook-cert\") pod \"infra-operator-controller-manager-6c46fc7bc5-hpt7d\" (UID: \"7a568963-c78a-41ee-ab5a-25f5d1eb0bb5\") " pod="openstack-operators/infra-operator-controller-manager-6c46fc7bc5-hpt7d" Mar 21 04:02:20 crc kubenswrapper[4685]: I0321 04:02:20.926151 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9hrt\" (UniqueName: \"kubernetes.io/projected/7a568963-c78a-41ee-ab5a-25f5d1eb0bb5-kube-api-access-x9hrt\") pod \"infra-operator-controller-manager-6c46fc7bc5-hpt7d\" (UID: \"7a568963-c78a-41ee-ab5a-25f5d1eb0bb5\") " 
pod="openstack-operators/infra-operator-controller-manager-6c46fc7bc5-hpt7d" Mar 21 04:02:20 crc kubenswrapper[4685]: I0321 04:02:20.926207 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7a568963-c78a-41ee-ab5a-25f5d1eb0bb5-apiservice-cert\") pod \"infra-operator-controller-manager-6c46fc7bc5-hpt7d\" (UID: \"7a568963-c78a-41ee-ab5a-25f5d1eb0bb5\") " pod="openstack-operators/infra-operator-controller-manager-6c46fc7bc5-hpt7d" Mar 21 04:02:20 crc kubenswrapper[4685]: I0321 04:02:20.926237 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7a568963-c78a-41ee-ab5a-25f5d1eb0bb5-webhook-cert\") pod \"infra-operator-controller-manager-6c46fc7bc5-hpt7d\" (UID: \"7a568963-c78a-41ee-ab5a-25f5d1eb0bb5\") " pod="openstack-operators/infra-operator-controller-manager-6c46fc7bc5-hpt7d" Mar 21 04:02:20 crc kubenswrapper[4685]: I0321 04:02:20.931677 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7a568963-c78a-41ee-ab5a-25f5d1eb0bb5-webhook-cert\") pod \"infra-operator-controller-manager-6c46fc7bc5-hpt7d\" (UID: \"7a568963-c78a-41ee-ab5a-25f5d1eb0bb5\") " pod="openstack-operators/infra-operator-controller-manager-6c46fc7bc5-hpt7d" Mar 21 04:02:20 crc kubenswrapper[4685]: I0321 04:02:20.932228 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7a568963-c78a-41ee-ab5a-25f5d1eb0bb5-apiservice-cert\") pod \"infra-operator-controller-manager-6c46fc7bc5-hpt7d\" (UID: \"7a568963-c78a-41ee-ab5a-25f5d1eb0bb5\") " pod="openstack-operators/infra-operator-controller-manager-6c46fc7bc5-hpt7d" Mar 21 04:02:20 crc kubenswrapper[4685]: I0321 04:02:20.950124 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9hrt\" (UniqueName: \"kubernetes.io/projected/7a568963-c78a-41ee-ab5a-25f5d1eb0bb5-kube-api-access-x9hrt\") pod \"infra-operator-controller-manager-6c46fc7bc5-hpt7d\" (UID: \"7a568963-c78a-41ee-ab5a-25f5d1eb0bb5\") " pod="openstack-operators/infra-operator-controller-manager-6c46fc7bc5-hpt7d" Mar 21 04:02:20 crc kubenswrapper[4685]: I0321 04:02:20.998700 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-6c46fc7bc5-hpt7d" Mar 21 04:02:21 crc kubenswrapper[4685]: W0321 04:02:21.441947 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a568963_c78a_41ee_ab5a_25f5d1eb0bb5.slice/crio-b0755a995f7eda0c84ac26eccb0bcda2b2c98831ecc1e853b3b0e9f8ae884b66 WatchSource:0}: Error finding container b0755a995f7eda0c84ac26eccb0bcda2b2c98831ecc1e853b3b0e9f8ae884b66: Status 404 returned error can't find the container with id b0755a995f7eda0c84ac26eccb0bcda2b2c98831ecc1e853b3b0e9f8ae884b66 Mar 21 04:02:21 crc kubenswrapper[4685]: I0321 04:02:21.443910 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-6c46fc7bc5-hpt7d"] Mar 21 04:02:21 crc kubenswrapper[4685]: I0321 04:02:21.908416 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-6c46fc7bc5-hpt7d" event={"ID":"7a568963-c78a-41ee-ab5a-25f5d1eb0bb5","Type":"ContainerStarted","Data":"b0755a995f7eda0c84ac26eccb0bcda2b2c98831ecc1e853b3b0e9f8ae884b66"} Mar 21 04:02:23 crc kubenswrapper[4685]: I0321 04:02:23.919348 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-6c46fc7bc5-hpt7d" event={"ID":"7a568963-c78a-41ee-ab5a-25f5d1eb0bb5","Type":"ContainerStarted","Data":"e9f6faedafec22c9e6bfb53ba5c487a41d1648f4b7bf69febdc4bddbd41264a4"} Mar 21 04:02:23 crc kubenswrapper[4685]: I0321 04:02:23.919920 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-6c46fc7bc5-hpt7d" Mar 21 04:02:23 crc kubenswrapper[4685]: I0321 04:02:23.938324 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-6c46fc7bc5-hpt7d" podStartSLOduration=1.811597811 podStartE2EDuration="3.938310668s" podCreationTimestamp="2026-03-21 04:02:20 +0000 UTC" firstStartedPulling="2026-03-21 04:02:21.446053113 +0000 UTC m=+973.923121905" lastFinishedPulling="2026-03-21 04:02:23.57276597 +0000 UTC m=+976.049834762" observedRunningTime="2026-03-21 04:02:23.934506529 +0000 UTC m=+976.411575331" watchObservedRunningTime="2026-03-21 04:02:23.938310668 +0000 UTC m=+976.415379460" Mar 21 04:02:24 crc kubenswrapper[4685]: I0321 04:02:24.851210 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/openstack-galera-0"] Mar 21 04:02:24 crc kubenswrapper[4685]: I0321 04:02:24.852232 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-0" Mar 21 04:02:24 crc kubenswrapper[4685]: I0321 04:02:24.855561 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"barbican-kuttl-tests"/"kube-root-ca.crt" Mar 21 04:02:24 crc kubenswrapper[4685]: I0321 04:02:24.855715 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"barbican-kuttl-tests"/"openstack-config-data" Mar 21 04:02:24 crc kubenswrapper[4685]: I0321 04:02:24.855760 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"galera-openstack-dockercfg-ntnkl" Mar 21 04:02:24 crc kubenswrapper[4685]: I0321 04:02:24.857089 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"barbican-kuttl-tests"/"openstack-scripts" Mar 21 04:02:24 crc kubenswrapper[4685]: I0321 04:02:24.860294 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"barbican-kuttl-tests"/"openshift-service-ca.crt" Mar 21 04:02:24 crc kubenswrapper[4685]: I0321 04:02:24.868073 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/openstack-galera-0"] Mar 21 04:02:24 crc kubenswrapper[4685]: I0321 04:02:24.872021 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/openstack-galera-2"] Mar 21 04:02:24 crc kubenswrapper[4685]: I0321 04:02:24.872941 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-2" Mar 21 04:02:24 crc kubenswrapper[4685]: I0321 04:02:24.878390 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/openstack-galera-1"] Mar 21 04:02:24 crc kubenswrapper[4685]: I0321 04:02:24.879445 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-1" Mar 21 04:02:24 crc kubenswrapper[4685]: I0321 04:02:24.883540 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/openstack-galera-2"] Mar 21 04:02:24 crc kubenswrapper[4685]: I0321 04:02:24.902771 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/openstack-galera-1"] Mar 21 04:02:24 crc kubenswrapper[4685]: I0321 04:02:24.978134 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xgm6\" (UniqueName: \"kubernetes.io/projected/a69263a8-bd4d-476c-99fc-f1202f36f8a0-kube-api-access-6xgm6\") pod \"openstack-galera-0\" (UID: \"a69263a8-bd4d-476c-99fc-f1202f36f8a0\") " pod="barbican-kuttl-tests/openstack-galera-0" Mar 21 04:02:24 crc kubenswrapper[4685]: I0321 04:02:24.978193 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a69263a8-bd4d-476c-99fc-f1202f36f8a0-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a69263a8-bd4d-476c-99fc-f1202f36f8a0\") " pod="barbican-kuttl-tests/openstack-galera-0" Mar 21 04:02:24 crc kubenswrapper[4685]: I0321 04:02:24.978215 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-2\" (UID: \"9ee610c4-8416-4d1c-a6b4-2324f1541b1c\") " pod="barbican-kuttl-tests/openstack-galera-2" Mar 21 04:02:24 crc kubenswrapper[4685]: I0321 04:02:24.978233 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/9ee610c4-8416-4d1c-a6b4-2324f1541b1c-operator-scripts\") pod \"openstack-galera-2\" (UID: \"9ee610c4-8416-4d1c-a6b4-2324f1541b1c\") " pod="barbican-kuttl-tests/openstack-galera-2" Mar 21 04:02:24 crc kubenswrapper[4685]: I0321 04:02:24.978259 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-1\" (UID: \"ba532d6b-607c-450f-adb7-8d4e14ff58e0\") " pod="barbican-kuttl-tests/openstack-galera-1" Mar 21 04:02:24 crc kubenswrapper[4685]: I0321 04:02:24.978273 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a69263a8-bd4d-476c-99fc-f1202f36f8a0-kolla-config\") pod \"openstack-galera-0\" (UID: \"a69263a8-bd4d-476c-99fc-f1202f36f8a0\") " pod="barbican-kuttl-tests/openstack-galera-0" Mar 21 04:02:24 crc kubenswrapper[4685]: I0321 04:02:24.978289 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"a69263a8-bd4d-476c-99fc-f1202f36f8a0\") " pod="barbican-kuttl-tests/openstack-galera-0" Mar 21 04:02:24 crc kubenswrapper[4685]: I0321 04:02:24.978304 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba532d6b-607c-450f-adb7-8d4e14ff58e0-operator-scripts\") pod \"openstack-galera-1\" (UID: \"ba532d6b-607c-450f-adb7-8d4e14ff58e0\") " pod="barbican-kuttl-tests/openstack-galera-1" Mar 21 04:02:24 crc kubenswrapper[4685]: I0321 04:02:24.978326 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9ee610c4-8416-4d1c-a6b4-2324f1541b1c-config-data-default\") pod \"openstack-galera-2\" (UID: \"9ee610c4-8416-4d1c-a6b4-2324f1541b1c\") " pod="barbican-kuttl-tests/openstack-galera-2" Mar 21 04:02:24 crc kubenswrapper[4685]: I0321 04:02:24.978547 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9ee610c4-8416-4d1c-a6b4-2324f1541b1c-kolla-config\") pod \"openstack-galera-2\" (UID: \"9ee610c4-8416-4d1c-a6b4-2324f1541b1c\") " pod="barbican-kuttl-tests/openstack-galera-2" Mar 21 04:02:24 crc kubenswrapper[4685]: I0321 04:02:24.978606 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzqbf\" (UniqueName: \"kubernetes.io/projected/ba532d6b-607c-450f-adb7-8d4e14ff58e0-kube-api-access-jzqbf\") pod \"openstack-galera-1\" (UID: \"ba532d6b-607c-450f-adb7-8d4e14ff58e0\") " pod="barbican-kuttl-tests/openstack-galera-1" Mar 21 04:02:24 crc kubenswrapper[4685]: I0321 04:02:24.978628 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a69263a8-bd4d-476c-99fc-f1202f36f8a0-config-data-default\") pod \"openstack-galera-0\" (UID: \"a69263a8-bd4d-476c-99fc-f1202f36f8a0\") " pod="barbican-kuttl-tests/openstack-galera-0" Mar 21 04:02:24 crc kubenswrapper[4685]: I0321 04:02:24.978660 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a69263a8-bd4d-476c-99fc-f1202f36f8a0-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a69263a8-bd4d-476c-99fc-f1202f36f8a0\") " pod="barbican-kuttl-tests/openstack-galera-0" Mar 21 04:02:24 crc kubenswrapper[4685]: I0321 04:02:24.978720 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48qd2\" (UniqueName: \"kubernetes.io/projected/9ee610c4-8416-4d1c-a6b4-2324f1541b1c-kube-api-access-48qd2\") pod \"openstack-galera-2\" (UID: \"9ee610c4-8416-4d1c-a6b4-2324f1541b1c\") " pod="barbican-kuttl-tests/openstack-galera-2" Mar 21 04:02:24 crc kubenswrapper[4685]: I0321 04:02:24.978766 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ba532d6b-607c-450f-adb7-8d4e14ff58e0-config-data-default\") pod \"openstack-galera-1\" (UID: \"ba532d6b-607c-450f-adb7-8d4e14ff58e0\") " pod="barbican-kuttl-tests/openstack-galera-1" Mar 21 04:02:24 crc kubenswrapper[4685]: I0321 04:02:24.978798 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9ee610c4-8416-4d1c-a6b4-2324f1541b1c-config-data-generated\") pod \"openstack-galera-2\" (UID: \"9ee610c4-8416-4d1c-a6b4-2324f1541b1c\") " pod="barbican-kuttl-tests/openstack-galera-2" Mar 21 04:02:24 crc kubenswrapper[4685]: I0321 04:02:24.978870 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ba532d6b-607c-450f-adb7-8d4e14ff58e0-kolla-config\") pod \"openstack-galera-1\" (UID: \"ba532d6b-607c-450f-adb7-8d4e14ff58e0\") " pod="barbican-kuttl-tests/openstack-galera-1" Mar 21 04:02:24 crc kubenswrapper[4685]: I0321 04:02:24.978892 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ba532d6b-607c-450f-adb7-8d4e14ff58e0-config-data-generated\") pod \"openstack-galera-1\" (UID: \"ba532d6b-607c-450f-adb7-8d4e14ff58e0\") " pod="barbican-kuttl-tests/openstack-galera-1" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.082317 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9ee610c4-8416-4d1c-a6b4-2324f1541b1c-kolla-config\") pod \"openstack-galera-2\" (UID: \"9ee610c4-8416-4d1c-a6b4-2324f1541b1c\") " pod="barbican-kuttl-tests/openstack-galera-2" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.082361 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzqbf\" (UniqueName: \"kubernetes.io/projected/ba532d6b-607c-450f-adb7-8d4e14ff58e0-kube-api-access-jzqbf\") pod \"openstack-galera-1\" (UID: \"ba532d6b-607c-450f-adb7-8d4e14ff58e0\") " pod="barbican-kuttl-tests/openstack-galera-1" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.082378 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a69263a8-bd4d-476c-99fc-f1202f36f8a0-config-data-default\") pod \"openstack-galera-0\" (UID: \"a69263a8-bd4d-476c-99fc-f1202f36f8a0\") " pod="barbican-kuttl-tests/openstack-galera-0" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.082403 4685 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a69263a8-bd4d-476c-99fc-f1202f36f8a0-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a69263a8-bd4d-476c-99fc-f1202f36f8a0\") " pod="barbican-kuttl-tests/openstack-galera-0" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.082422 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48qd2\" (UniqueName: \"kubernetes.io/projected/9ee610c4-8416-4d1c-a6b4-2324f1541b1c-kube-api-access-48qd2\") pod \"openstack-galera-2\" (UID: \"9ee610c4-8416-4d1c-a6b4-2324f1541b1c\") " pod="barbican-kuttl-tests/openstack-galera-2" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.082450 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ba532d6b-607c-450f-adb7-8d4e14ff58e0-config-data-default\") pod \"openstack-galera-1\" (UID: \"ba532d6b-607c-450f-adb7-8d4e14ff58e0\") " pod="barbican-kuttl-tests/openstack-galera-1" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.082469 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9ee610c4-8416-4d1c-a6b4-2324f1541b1c-config-data-generated\") pod \"openstack-galera-2\" (UID: \"9ee610c4-8416-4d1c-a6b4-2324f1541b1c\") " pod="barbican-kuttl-tests/openstack-galera-2" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.082492 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ba532d6b-607c-450f-adb7-8d4e14ff58e0-kolla-config\") pod \"openstack-galera-1\" (UID: \"ba532d6b-607c-450f-adb7-8d4e14ff58e0\") " pod="barbican-kuttl-tests/openstack-galera-1" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.082508 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ba532d6b-607c-450f-adb7-8d4e14ff58e0-config-data-generated\") pod \"openstack-galera-1\" (UID: \"ba532d6b-607c-450f-adb7-8d4e14ff58e0\") " pod="barbican-kuttl-tests/openstack-galera-1" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.082528 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xgm6\" (UniqueName: \"kubernetes.io/projected/a69263a8-bd4d-476c-99fc-f1202f36f8a0-kube-api-access-6xgm6\") pod \"openstack-galera-0\" (UID: \"a69263a8-bd4d-476c-99fc-f1202f36f8a0\") " pod="barbican-kuttl-tests/openstack-galera-0" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.082560 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a69263a8-bd4d-476c-99fc-f1202f36f8a0-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a69263a8-bd4d-476c-99fc-f1202f36f8a0\") " pod="barbican-kuttl-tests/openstack-galera-0" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.082584 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-2\" (UID: \"9ee610c4-8416-4d1c-a6b4-2324f1541b1c\") " pod="barbican-kuttl-tests/openstack-galera-2" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.082601 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/9ee610c4-8416-4d1c-a6b4-2324f1541b1c-operator-scripts\") pod \"openstack-galera-2\" (UID: \"9ee610c4-8416-4d1c-a6b4-2324f1541b1c\") " pod="barbican-kuttl-tests/openstack-galera-2" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.082623 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-1\" (UID: \"ba532d6b-607c-450f-adb7-8d4e14ff58e0\") " pod="barbican-kuttl-tests/openstack-galera-1" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.082639 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a69263a8-bd4d-476c-99fc-f1202f36f8a0-kolla-config\") pod \"openstack-galera-0\" (UID: \"a69263a8-bd4d-476c-99fc-f1202f36f8a0\") " pod="barbican-kuttl-tests/openstack-galera-0" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.082658 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"a69263a8-bd4d-476c-99fc-f1202f36f8a0\") " pod="barbican-kuttl-tests/openstack-galera-0" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.082675 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba532d6b-607c-450f-adb7-8d4e14ff58e0-operator-scripts\") pod \"openstack-galera-1\" (UID: \"ba532d6b-607c-450f-adb7-8d4e14ff58e0\") " pod="barbican-kuttl-tests/openstack-galera-1" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.082697 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9ee610c4-8416-4d1c-a6b4-2324f1541b1c-config-data-default\") pod \"openstack-galera-2\" (UID: \"9ee610c4-8416-4d1c-a6b4-2324f1541b1c\") " pod="barbican-kuttl-tests/openstack-galera-2" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.083041 4685 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-1\" (UID: \"ba532d6b-607c-450f-adb7-8d4e14ff58e0\") device mount path \"/mnt/openstack/pv10\"" pod="barbican-kuttl-tests/openstack-galera-1" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.083201 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9ee610c4-8416-4d1c-a6b4-2324f1541b1c-kolla-config\") pod \"openstack-galera-2\" (UID: \"9ee610c4-8416-4d1c-a6b4-2324f1541b1c\") " pod="barbican-kuttl-tests/openstack-galera-2" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.083214 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ba532d6b-607c-450f-adb7-8d4e14ff58e0-config-data-default\") pod \"openstack-galera-1\" (UID: \"ba532d6b-607c-450f-adb7-8d4e14ff58e0\") " pod="barbican-kuttl-tests/openstack-galera-1" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.083298 4685 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"a69263a8-bd4d-476c-99fc-f1202f36f8a0\") device mount path 
\"/mnt/openstack/pv11\"" pod="barbican-kuttl-tests/openstack-galera-0" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.083410 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9ee610c4-8416-4d1c-a6b4-2324f1541b1c-config-data-default\") pod \"openstack-galera-2\" (UID: \"9ee610c4-8416-4d1c-a6b4-2324f1541b1c\") " pod="barbican-kuttl-tests/openstack-galera-2" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.083621 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9ee610c4-8416-4d1c-a6b4-2324f1541b1c-config-data-generated\") pod \"openstack-galera-2\" (UID: \"9ee610c4-8416-4d1c-a6b4-2324f1541b1c\") " pod="barbican-kuttl-tests/openstack-galera-2" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.084119 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ba532d6b-607c-450f-adb7-8d4e14ff58e0-kolla-config\") pod \"openstack-galera-1\" (UID: \"ba532d6b-607c-450f-adb7-8d4e14ff58e0\") " pod="barbican-kuttl-tests/openstack-galera-1" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.084356 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ba532d6b-607c-450f-adb7-8d4e14ff58e0-config-data-generated\") pod \"openstack-galera-1\" (UID: \"ba532d6b-607c-450f-adb7-8d4e14ff58e0\") " pod="barbican-kuttl-tests/openstack-galera-1" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.084449 4685 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-2\" (UID: \"9ee610c4-8416-4d1c-a6b4-2324f1541b1c\") device mount path \"/mnt/openstack/pv02\"" pod="barbican-kuttl-tests/openstack-galera-2" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.084855 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba532d6b-607c-450f-adb7-8d4e14ff58e0-operator-scripts\") pod \"openstack-galera-1\" (UID: \"ba532d6b-607c-450f-adb7-8d4e14ff58e0\") " pod="barbican-kuttl-tests/openstack-galera-1" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.086002 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ee610c4-8416-4d1c-a6b4-2324f1541b1c-operator-scripts\") pod \"openstack-galera-2\" (UID: \"9ee610c4-8416-4d1c-a6b4-2324f1541b1c\") " pod="barbican-kuttl-tests/openstack-galera-2" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.089371 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a69263a8-bd4d-476c-99fc-f1202f36f8a0-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a69263a8-bd4d-476c-99fc-f1202f36f8a0\") " pod="barbican-kuttl-tests/openstack-galera-0" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.089683 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a69263a8-bd4d-476c-99fc-f1202f36f8a0-config-data-default\") pod \"openstack-galera-0\" (UID: \"a69263a8-bd4d-476c-99fc-f1202f36f8a0\") " pod="barbican-kuttl-tests/openstack-galera-0" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 
04:02:25.089982 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a69263a8-bd4d-476c-99fc-f1202f36f8a0-kolla-config\") pod \"openstack-galera-0\" (UID: \"a69263a8-bd4d-476c-99fc-f1202f36f8a0\") " pod="barbican-kuttl-tests/openstack-galera-0" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.090720 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a69263a8-bd4d-476c-99fc-f1202f36f8a0-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a69263a8-bd4d-476c-99fc-f1202f36f8a0\") " pod="barbican-kuttl-tests/openstack-galera-0" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.102483 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48qd2\" (UniqueName: \"kubernetes.io/projected/9ee610c4-8416-4d1c-a6b4-2324f1541b1c-kube-api-access-48qd2\") pod \"openstack-galera-2\" (UID: \"9ee610c4-8416-4d1c-a6b4-2324f1541b1c\") " pod="barbican-kuttl-tests/openstack-galera-2" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.102538 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-1\" (UID: \"ba532d6b-607c-450f-adb7-8d4e14ff58e0\") " pod="barbican-kuttl-tests/openstack-galera-1" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.103277 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-2\" (UID: \"9ee610c4-8416-4d1c-a6b4-2324f1541b1c\") " pod="barbican-kuttl-tests/openstack-galera-2" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.104524 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"a69263a8-bd4d-476c-99fc-f1202f36f8a0\") " pod="barbican-kuttl-tests/openstack-galera-0" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.110639 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xgm6\" (UniqueName: \"kubernetes.io/projected/a69263a8-bd4d-476c-99fc-f1202f36f8a0-kube-api-access-6xgm6\") pod \"openstack-galera-0\" (UID: \"a69263a8-bd4d-476c-99fc-f1202f36f8a0\") " pod="barbican-kuttl-tests/openstack-galera-0" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.111187 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzqbf\" (UniqueName: \"kubernetes.io/projected/ba532d6b-607c-450f-adb7-8d4e14ff58e0-kube-api-access-jzqbf\") pod \"openstack-galera-1\" (UID: \"ba532d6b-607c-450f-adb7-8d4e14ff58e0\") " pod="barbican-kuttl-tests/openstack-galera-1" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.169151 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-0" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.190719 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-2" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.197648 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-1" Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.394799 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/openstack-galera-0"] Mar 21 04:02:25 crc kubenswrapper[4685]: W0321 04:02:25.411237 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda69263a8_bd4d_476c_99fc_f1202f36f8a0.slice/crio-825b5fff34eacc0ff1793d29d8ff3144045342d1536eff1ccad3868db4ab618f WatchSource:0}: Error finding container 825b5fff34eacc0ff1793d29d8ff3144045342d1536eff1ccad3868db4ab618f: Status 404 returned error can't find the container with id 825b5fff34eacc0ff1793d29d8ff3144045342d1536eff1ccad3868db4ab618f Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.445027 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/openstack-galera-2"] Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.478025 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/openstack-galera-1"] Mar 21 04:02:25 crc kubenswrapper[4685]: W0321 04:02:25.481200 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba532d6b_607c_450f_adb7_8d4e14ff58e0.slice/crio-6ce6283a7b17cf5db4ce52dd0c93515b94c32f6a225cdc2a9ea154d431d6c3c4 WatchSource:0}: Error finding container 6ce6283a7b17cf5db4ce52dd0c93515b94c32f6a225cdc2a9ea154d431d6c3c4: Status 404 returned error can't find the container with id 6ce6283a7b17cf5db4ce52dd0c93515b94c32f6a225cdc2a9ea154d431d6c3c4 Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.932695 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-2" event={"ID":"9ee610c4-8416-4d1c-a6b4-2324f1541b1c","Type":"ContainerStarted","Data":"adc44a604e9766d4f8e09edba3cff5d49780e00f4296f5f7913a9ba0cf852aa9"} Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.934076 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-0" event={"ID":"a69263a8-bd4d-476c-99fc-f1202f36f8a0","Type":"ContainerStarted","Data":"825b5fff34eacc0ff1793d29d8ff3144045342d1536eff1ccad3868db4ab618f"} Mar 21 04:02:25 crc kubenswrapper[4685]: I0321 04:02:25.935207 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-1" event={"ID":"ba532d6b-607c-450f-adb7-8d4e14ff58e0","Type":"ContainerStarted","Data":"6ce6283a7b17cf5db4ce52dd0c93515b94c32f6a225cdc2a9ea154d431d6c3c4"} Mar 21 04:02:28 crc kubenswrapper[4685]: I0321 04:02:28.048408 4685 scope.go:117] "RemoveContainer" containerID="ec65004c87a3ece56139f7b03f69c9b9926eef8f676f1143526c439647935efa" Mar 21 04:02:31 crc kubenswrapper[4685]: I0321 04:02:31.003649 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-6c46fc7bc5-hpt7d" Mar 21 04:02:31 crc kubenswrapper[4685]: I0321 04:02:31.529371 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/memcached-0"] Mar 21 04:02:31 crc kubenswrapper[4685]: I0321 04:02:31.530061 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/memcached-0" Mar 21 04:02:31 crc kubenswrapper[4685]: I0321 04:02:31.531709 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"memcached-memcached-dockercfg-z6v85" Mar 21 04:02:31 crc kubenswrapper[4685]: I0321 04:02:31.533690 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"barbican-kuttl-tests"/"memcached-config-data" Mar 21 04:02:31 crc kubenswrapper[4685]: I0321 04:02:31.546167 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/memcached-0"] Mar 21 04:02:31 crc kubenswrapper[4685]: I0321 04:02:31.676772 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhlbn\" (UniqueName: \"kubernetes.io/projected/91653387-c2f1-4240-b710-e0c709eb769d-kube-api-access-nhlbn\") pod \"memcached-0\" (UID: \"91653387-c2f1-4240-b710-e0c709eb769d\") " pod="barbican-kuttl-tests/memcached-0" Mar 21 04:02:31 crc kubenswrapper[4685]: I0321 04:02:31.676891 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/91653387-c2f1-4240-b710-e0c709eb769d-config-data\") pod \"memcached-0\" (UID: \"91653387-c2f1-4240-b710-e0c709eb769d\") " pod="barbican-kuttl-tests/memcached-0" Mar 21 04:02:31 crc kubenswrapper[4685]: I0321 04:02:31.676923 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/91653387-c2f1-4240-b710-e0c709eb769d-kolla-config\") pod \"memcached-0\" (UID: \"91653387-c2f1-4240-b710-e0c709eb769d\") " pod="barbican-kuttl-tests/memcached-0" Mar 21 04:02:31 crc kubenswrapper[4685]: I0321 04:02:31.778271 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhlbn\" (UniqueName: \"kubernetes.io/projected/91653387-c2f1-4240-b710-e0c709eb769d-kube-api-access-nhlbn\") pod \"memcached-0\" (UID: \"91653387-c2f1-4240-b710-e0c709eb769d\") " pod="barbican-kuttl-tests/memcached-0" Mar 21 04:02:31 crc kubenswrapper[4685]: I0321 04:02:31.778357 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/91653387-c2f1-4240-b710-e0c709eb769d-config-data\") pod \"memcached-0\" (UID: \"91653387-c2f1-4240-b710-e0c709eb769d\") " pod="barbican-kuttl-tests/memcached-0" Mar 21 04:02:31 crc kubenswrapper[4685]: I0321 04:02:31.778378 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/91653387-c2f1-4240-b710-e0c709eb769d-kolla-config\") pod \"memcached-0\" (UID: \"91653387-c2f1-4240-b710-e0c709eb769d\") " pod="barbican-kuttl-tests/memcached-0" Mar 21 04:02:31 crc kubenswrapper[4685]: I0321 04:02:31.779163 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/91653387-c2f1-4240-b710-e0c709eb769d-kolla-config\") pod \"memcached-0\" (UID: \"91653387-c2f1-4240-b710-e0c709eb769d\") " pod="barbican-kuttl-tests/memcached-0" Mar 21 04:02:31 crc kubenswrapper[4685]: I0321 04:02:31.779912 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/91653387-c2f1-4240-b710-e0c709eb769d-config-data\") pod \"memcached-0\" (UID: \"91653387-c2f1-4240-b710-e0c709eb769d\") " 
pod="barbican-kuttl-tests/memcached-0" Mar 21 04:02:31 crc kubenswrapper[4685]: I0321 04:02:31.800128 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhlbn\" (UniqueName: \"kubernetes.io/projected/91653387-c2f1-4240-b710-e0c709eb769d-kube-api-access-nhlbn\") pod \"memcached-0\" (UID: \"91653387-c2f1-4240-b710-e0c709eb769d\") " pod="barbican-kuttl-tests/memcached-0" Mar 21 04:02:31 crc kubenswrapper[4685]: I0321 04:02:31.846857 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/memcached-0" Mar 21 04:02:32 crc kubenswrapper[4685]: I0321 04:02:32.985375 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-2" event={"ID":"9ee610c4-8416-4d1c-a6b4-2324f1541b1c","Type":"ContainerStarted","Data":"afd181791b9ffd26344f58b9bd55f908ec8be4288ead7e37013df654d16e931b"} Mar 21 04:02:32 crc kubenswrapper[4685]: I0321 04:02:32.987476 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-0" event={"ID":"a69263a8-bd4d-476c-99fc-f1202f36f8a0","Type":"ContainerStarted","Data":"1a22266022bad721a506164b62c635ec56de1ed3b509234096cfdda97ceead41"} Mar 21 04:02:32 crc kubenswrapper[4685]: I0321 04:02:32.989488 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-1" event={"ID":"ba532d6b-607c-450f-adb7-8d4e14ff58e0","Type":"ContainerStarted","Data":"571f7affd534842985c2e3a136e5f111bc11cbc05ccb19a322fe4c8369044e8d"} Mar 21 04:02:32 crc kubenswrapper[4685]: I0321 04:02:32.998656 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/memcached-0"] Mar 21 04:02:33 crc kubenswrapper[4685]: I0321 04:02:33.998227 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/memcached-0" event={"ID":"91653387-c2f1-4240-b710-e0c709eb769d","Type":"ContainerStarted","Data":"029a15848e86b090e45728193416b05a946b75994bc4c7dbbeb230458a7ae8ea"} Mar 21 04:02:34 crc kubenswrapper[4685]: I0321 04:02:34.442804 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-qpvg8"] Mar 21 04:02:34 crc kubenswrapper[4685]: I0321 04:02:34.443684 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-qpvg8" Mar 21 04:02:34 crc kubenswrapper[4685]: I0321 04:02:34.445517 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-index-dockercfg-bb5wb" Mar 21 04:02:34 crc kubenswrapper[4685]: I0321 04:02:34.462253 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-qpvg8"] Mar 21 04:02:34 crc kubenswrapper[4685]: I0321 04:02:34.521460 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7lqg\" (UniqueName: \"kubernetes.io/projected/67f7d3a9-f1d6-4c73-a95f-8783abe85339-kube-api-access-r7lqg\") pod \"rabbitmq-cluster-operator-index-qpvg8\" (UID: \"67f7d3a9-f1d6-4c73-a95f-8783abe85339\") " pod="openstack-operators/rabbitmq-cluster-operator-index-qpvg8" Mar 21 04:02:34 crc kubenswrapper[4685]: I0321 04:02:34.622919 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7lqg\" (UniqueName: \"kubernetes.io/projected/67f7d3a9-f1d6-4c73-a95f-8783abe85339-kube-api-access-r7lqg\") pod \"rabbitmq-cluster-operator-index-qpvg8\" (UID: \"67f7d3a9-f1d6-4c73-a95f-8783abe85339\") " pod="openstack-operators/rabbitmq-cluster-operator-index-qpvg8" Mar 21 04:02:34 crc kubenswrapper[4685]: I0321 04:02:34.642234 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7lqg\" (UniqueName: \"kubernetes.io/projected/67f7d3a9-f1d6-4c73-a95f-8783abe85339-kube-api-access-r7lqg\") pod \"rabbitmq-cluster-operator-index-qpvg8\" (UID: \"67f7d3a9-f1d6-4c73-a95f-8783abe85339\") " pod="openstack-operators/rabbitmq-cluster-operator-index-qpvg8" Mar 21 04:02:34 crc kubenswrapper[4685]: I0321 04:02:34.774698 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-qpvg8" Mar 21 04:02:35 crc kubenswrapper[4685]: I0321 04:02:35.005177 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/memcached-0" event={"ID":"91653387-c2f1-4240-b710-e0c709eb769d","Type":"ContainerStarted","Data":"78e1e20967447cec136443726f8d2e76ac857183f77e3fec410e0f394f81503a"} Mar 21 04:02:35 crc kubenswrapper[4685]: I0321 04:02:35.005541 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/memcached-0" Mar 21 04:02:35 crc kubenswrapper[4685]: I0321 04:02:35.024811 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/memcached-0" podStartSLOduration=2.269333283 podStartE2EDuration="4.024792954s" podCreationTimestamp="2026-03-21 04:02:31 +0000 UTC" firstStartedPulling="2026-03-21 04:02:33.02171081 +0000 UTC m=+985.498779602" lastFinishedPulling="2026-03-21 04:02:34.777170481 +0000 UTC m=+987.254239273" observedRunningTime="2026-03-21 04:02:35.02324774 +0000 UTC m=+987.500316532" watchObservedRunningTime="2026-03-21 04:02:35.024792954 +0000 UTC m=+987.501861746" Mar 21 04:02:35 crc kubenswrapper[4685]: I0321 04:02:35.178741 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-qpvg8"] Mar 21 04:02:35 crc kubenswrapper[4685]: W0321 04:02:35.185587 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67f7d3a9_f1d6_4c73_a95f_8783abe85339.slice/crio-484fea16b29e4cbd6ec3b1f0b7fb81234d7fd2c345a58172264ff7ff49d95077 WatchSource:0}: Error finding container 484fea16b29e4cbd6ec3b1f0b7fb81234d7fd2c345a58172264ff7ff49d95077: Status 404 returned error can't find the container with id 484fea16b29e4cbd6ec3b1f0b7fb81234d7fd2c345a58172264ff7ff49d95077 Mar 21 04:02:36 crc kubenswrapper[4685]: I0321 04:02:36.013155 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-qpvg8" event={"ID":"67f7d3a9-f1d6-4c73-a95f-8783abe85339","Type":"ContainerStarted","Data":"484fea16b29e4cbd6ec3b1f0b7fb81234d7fd2c345a58172264ff7ff49d95077"} Mar 21 04:02:37 crc kubenswrapper[4685]: I0321 04:02:37.021968 4685 generic.go:334] "Generic (PLEG): container finished" podID="9ee610c4-8416-4d1c-a6b4-2324f1541b1c" containerID="afd181791b9ffd26344f58b9bd55f908ec8be4288ead7e37013df654d16e931b" exitCode=0 Mar 21 04:02:37 crc kubenswrapper[4685]: I0321 04:02:37.022060 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-2" event={"ID":"9ee610c4-8416-4d1c-a6b4-2324f1541b1c","Type":"ContainerDied","Data":"afd181791b9ffd26344f58b9bd55f908ec8be4288ead7e37013df654d16e931b"} Mar 21 04:02:37 crc kubenswrapper[4685]: I0321 04:02:37.024036 4685 generic.go:334] "Generic (PLEG): container finished" podID="a69263a8-bd4d-476c-99fc-f1202f36f8a0" containerID="1a22266022bad721a506164b62c635ec56de1ed3b509234096cfdda97ceead41" exitCode=0 Mar 21 04:02:37 crc kubenswrapper[4685]: I0321 04:02:37.024081 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-0" event={"ID":"a69263a8-bd4d-476c-99fc-f1202f36f8a0","Type":"ContainerDied","Data":"1a22266022bad721a506164b62c635ec56de1ed3b509234096cfdda97ceead41"} Mar 21 04:02:37 crc kubenswrapper[4685]: I0321 04:02:37.026229 4685 generic.go:334] "Generic (PLEG): container finished" podID="ba532d6b-607c-450f-adb7-8d4e14ff58e0" 
containerID="571f7affd534842985c2e3a136e5f111bc11cbc05ccb19a322fe4c8369044e8d" exitCode=0 Mar 21 04:02:37 crc kubenswrapper[4685]: I0321 04:02:37.026248 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-1" event={"ID":"ba532d6b-607c-450f-adb7-8d4e14ff58e0","Type":"ContainerDied","Data":"571f7affd534842985c2e3a136e5f111bc11cbc05ccb19a322fe4c8369044e8d"} Mar 21 04:02:38 crc kubenswrapper[4685]: I0321 04:02:38.837967 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-qpvg8"] Mar 21 04:02:39 crc kubenswrapper[4685]: I0321 04:02:39.037339 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-1" event={"ID":"ba532d6b-607c-450f-adb7-8d4e14ff58e0","Type":"ContainerStarted","Data":"1bdf9e4791a3ad52a31be260ed2573c28a204e6cfe08eb84afa1701f1ae9ad1e"} Mar 21 04:02:39 crc kubenswrapper[4685]: I0321 04:02:39.039766 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-qpvg8" event={"ID":"67f7d3a9-f1d6-4c73-a95f-8783abe85339","Type":"ContainerStarted","Data":"d3f78b60c6f40fc975525bc707ef27659b213f077105af5c96ef9d68d196e90f"} Mar 21 04:02:39 crc kubenswrapper[4685]: I0321 04:02:39.044001 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-2" event={"ID":"9ee610c4-8416-4d1c-a6b4-2324f1541b1c","Type":"ContainerStarted","Data":"2f406f57b1aed489f01d7e0d6eabda6a3b46be0f98918dc327c2eb44ae2b8f20"} Mar 21 04:02:39 crc kubenswrapper[4685]: I0321 04:02:39.045978 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-0" event={"ID":"a69263a8-bd4d-476c-99fc-f1202f36f8a0","Type":"ContainerStarted","Data":"6ef5c5d3b88cab93dc79845560aae01f51beff644d970c48e0096ebed008e109"} Mar 21 04:02:39 crc kubenswrapper[4685]: I0321 04:02:39.116893 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/openstack-galera-0" podStartSLOduration=8.942552324 podStartE2EDuration="16.116879056s" podCreationTimestamp="2026-03-21 04:02:23 +0000 UTC" firstStartedPulling="2026-03-21 04:02:25.413084492 +0000 UTC m=+977.890153284" lastFinishedPulling="2026-03-21 04:02:32.587411234 +0000 UTC m=+985.064480016" observedRunningTime="2026-03-21 04:02:39.11457417 +0000 UTC m=+991.591642982" watchObservedRunningTime="2026-03-21 04:02:39.116879056 +0000 UTC m=+991.593947848" Mar 21 04:02:39 crc kubenswrapper[4685]: I0321 04:02:39.117256 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/openstack-galera-1" podStartSLOduration=8.986778833 podStartE2EDuration="16.117250897s" podCreationTimestamp="2026-03-21 04:02:23 +0000 UTC" firstStartedPulling="2026-03-21 04:02:25.482640292 +0000 UTC m=+977.959709084" lastFinishedPulling="2026-03-21 04:02:32.613112356 +0000 UTC m=+985.090181148" observedRunningTime="2026-03-21 04:02:39.07926978 +0000 UTC m=+991.556338572" watchObservedRunningTime="2026-03-21 04:02:39.117250897 +0000 UTC m=+991.594319689" Mar 21 04:02:39 crc kubenswrapper[4685]: I0321 04:02:39.133483 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-qpvg8" podStartSLOduration=1.8268876459999999 podStartE2EDuration="5.133468356s" podCreationTimestamp="2026-03-21 04:02:34 +0000 UTC" firstStartedPulling="2026-03-21 04:02:35.187902016 +0000 UTC m=+987.664970808" lastFinishedPulling="2026-03-21 
04:02:38.494482726 +0000 UTC m=+990.971551518" observedRunningTime="2026-03-21 04:02:39.132973251 +0000 UTC m=+991.610042043" watchObservedRunningTime="2026-03-21 04:02:39.133468356 +0000 UTC m=+991.610537148" Mar 21 04:02:39 crc kubenswrapper[4685]: I0321 04:02:39.161637 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/openstack-galera-2" podStartSLOduration=8.950465774 podStartE2EDuration="16.161619109s" podCreationTimestamp="2026-03-21 04:02:23 +0000 UTC" firstStartedPulling="2026-03-21 04:02:25.463009585 +0000 UTC m=+977.940078377" lastFinishedPulling="2026-03-21 04:02:32.67416293 +0000 UTC m=+985.151231712" observedRunningTime="2026-03-21 04:02:39.159325893 +0000 UTC m=+991.636394695" watchObservedRunningTime="2026-03-21 04:02:39.161619109 +0000 UTC m=+991.638687891" Mar 21 04:02:39 crc kubenswrapper[4685]: I0321 04:02:39.448259 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-vl4k5"] Mar 21 04:02:39 crc kubenswrapper[4685]: I0321 04:02:39.449036 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-vl4k5" Mar 21 04:02:39 crc kubenswrapper[4685]: I0321 04:02:39.471123 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-vl4k5"] Mar 21 04:02:39 crc kubenswrapper[4685]: I0321 04:02:39.498028 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8r4n\" (UniqueName: \"kubernetes.io/projected/f07c1e30-7d41-456a-adb4-2d042b562bf7-kube-api-access-v8r4n\") pod \"rabbitmq-cluster-operator-index-vl4k5\" (UID: \"f07c1e30-7d41-456a-adb4-2d042b562bf7\") " pod="openstack-operators/rabbitmq-cluster-operator-index-vl4k5" Mar 21 04:02:39 crc kubenswrapper[4685]: I0321 04:02:39.599774 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8r4n\" (UniqueName: \"kubernetes.io/projected/f07c1e30-7d41-456a-adb4-2d042b562bf7-kube-api-access-v8r4n\") pod \"rabbitmq-cluster-operator-index-vl4k5\" (UID: \"f07c1e30-7d41-456a-adb4-2d042b562bf7\") " pod="openstack-operators/rabbitmq-cluster-operator-index-vl4k5" Mar 21 04:02:39 crc kubenswrapper[4685]: I0321 04:02:39.619195 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8r4n\" (UniqueName: \"kubernetes.io/projected/f07c1e30-7d41-456a-adb4-2d042b562bf7-kube-api-access-v8r4n\") pod \"rabbitmq-cluster-operator-index-vl4k5\" (UID: \"f07c1e30-7d41-456a-adb4-2d042b562bf7\") " pod="openstack-operators/rabbitmq-cluster-operator-index-vl4k5" Mar 21 04:02:39 crc kubenswrapper[4685]: I0321 04:02:39.685815 4685 patch_prober.go:28] interesting pod/machine-config-daemon-7r9cg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:02:39 crc kubenswrapper[4685]: I0321 04:02:39.685889 4685 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:02:39 crc kubenswrapper[4685]: I0321 04:02:39.763458 4685 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-vl4k5" Mar 21 04:02:40 crc kubenswrapper[4685]: I0321 04:02:40.051336 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-index-qpvg8" podUID="67f7d3a9-f1d6-4c73-a95f-8783abe85339" containerName="registry-server" containerID="cri-o://d3f78b60c6f40fc975525bc707ef27659b213f077105af5c96ef9d68d196e90f" gracePeriod=2 Mar 21 04:02:40 crc kubenswrapper[4685]: I0321 04:02:40.210276 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-vl4k5"] Mar 21 04:02:40 crc kubenswrapper[4685]: W0321 04:02:40.218515 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf07c1e30_7d41_456a_adb4_2d042b562bf7.slice/crio-ea86bedfd21c076e329bdb65d6a975f371031d383f9a853c61d4426eedd8a1f2 WatchSource:0}: Error finding container ea86bedfd21c076e329bdb65d6a975f371031d383f9a853c61d4426eedd8a1f2: Status 404 returned error can't find the container with id ea86bedfd21c076e329bdb65d6a975f371031d383f9a853c61d4426eedd8a1f2 Mar 21 04:02:40 crc kubenswrapper[4685]: I0321 04:02:40.456452 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-qpvg8" Mar 21 04:02:40 crc kubenswrapper[4685]: I0321 04:02:40.511667 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7lqg\" (UniqueName: \"kubernetes.io/projected/67f7d3a9-f1d6-4c73-a95f-8783abe85339-kube-api-access-r7lqg\") pod \"67f7d3a9-f1d6-4c73-a95f-8783abe85339\" (UID: \"67f7d3a9-f1d6-4c73-a95f-8783abe85339\") " Mar 21 04:02:40 crc kubenswrapper[4685]: I0321 04:02:40.517949 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67f7d3a9-f1d6-4c73-a95f-8783abe85339-kube-api-access-r7lqg" (OuterVolumeSpecName: "kube-api-access-r7lqg") pod "67f7d3a9-f1d6-4c73-a95f-8783abe85339" (UID: "67f7d3a9-f1d6-4c73-a95f-8783abe85339"). InnerVolumeSpecName "kube-api-access-r7lqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:02:40 crc kubenswrapper[4685]: I0321 04:02:40.613460 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7lqg\" (UniqueName: \"kubernetes.io/projected/67f7d3a9-f1d6-4c73-a95f-8783abe85339-kube-api-access-r7lqg\") on node \"crc\" DevicePath \"\"" Mar 21 04:02:41 crc kubenswrapper[4685]: I0321 04:02:41.060871 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-qpvg8" Mar 21 04:02:41 crc kubenswrapper[4685]: I0321 04:02:41.060881 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-qpvg8" event={"ID":"67f7d3a9-f1d6-4c73-a95f-8783abe85339","Type":"ContainerDied","Data":"d3f78b60c6f40fc975525bc707ef27659b213f077105af5c96ef9d68d196e90f"} Mar 21 04:02:41 crc kubenswrapper[4685]: I0321 04:02:41.060950 4685 scope.go:117] "RemoveContainer" containerID="d3f78b60c6f40fc975525bc707ef27659b213f077105af5c96ef9d68d196e90f" Mar 21 04:02:41 crc kubenswrapper[4685]: I0321 04:02:41.060818 4685 generic.go:334] "Generic (PLEG): container finished" podID="67f7d3a9-f1d6-4c73-a95f-8783abe85339" containerID="d3f78b60c6f40fc975525bc707ef27659b213f077105af5c96ef9d68d196e90f" exitCode=0 Mar 21 04:02:41 crc kubenswrapper[4685]: I0321 04:02:41.061156 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-qpvg8" event={"ID":"67f7d3a9-f1d6-4c73-a95f-8783abe85339","Type":"ContainerDied","Data":"484fea16b29e4cbd6ec3b1f0b7fb81234d7fd2c345a58172264ff7ff49d95077"} Mar 21 04:02:41 crc kubenswrapper[4685]: I0321 04:02:41.067488 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-vl4k5" event={"ID":"f07c1e30-7d41-456a-adb4-2d042b562bf7","Type":"ContainerStarted","Data":"99d4d708ebba1cf37ef65d04f6a9587ba4efa402ad9d1d22b7762f35a5e9e79e"} Mar 21 04:02:41 crc kubenswrapper[4685]: I0321 04:02:41.067583 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-vl4k5" event={"ID":"f07c1e30-7d41-456a-adb4-2d042b562bf7","Type":"ContainerStarted","Data":"ea86bedfd21c076e329bdb65d6a975f371031d383f9a853c61d4426eedd8a1f2"} Mar 21 04:02:41 crc kubenswrapper[4685]: I0321 04:02:41.085218 4685 scope.go:117] "RemoveContainer" containerID="d3f78b60c6f40fc975525bc707ef27659b213f077105af5c96ef9d68d196e90f" Mar 21 04:02:41 crc kubenswrapper[4685]: E0321 04:02:41.085695 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3f78b60c6f40fc975525bc707ef27659b213f077105af5c96ef9d68d196e90f\": container with ID starting with d3f78b60c6f40fc975525bc707ef27659b213f077105af5c96ef9d68d196e90f not found: ID does not exist" containerID="d3f78b60c6f40fc975525bc707ef27659b213f077105af5c96ef9d68d196e90f" Mar 21 04:02:41 crc kubenswrapper[4685]: I0321 04:02:41.085738 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3f78b60c6f40fc975525bc707ef27659b213f077105af5c96ef9d68d196e90f"} err="failed to get container status \"d3f78b60c6f40fc975525bc707ef27659b213f077105af5c96ef9d68d196e90f\": rpc error: code = NotFound desc = could not find container \"d3f78b60c6f40fc975525bc707ef27659b213f077105af5c96ef9d68d196e90f\": container with ID starting with d3f78b60c6f40fc975525bc707ef27659b213f077105af5c96ef9d68d196e90f not found: ID does not exist" Mar 21 04:02:41 crc kubenswrapper[4685]: I0321 04:02:41.096727 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-vl4k5" podStartSLOduration=1.683343068 podStartE2EDuration="2.09670019s" podCreationTimestamp="2026-03-21 04:02:39 +0000 UTC" firstStartedPulling="2026-03-21 04:02:40.222557877 +0000 UTC m=+992.699626669" lastFinishedPulling="2026-03-21 04:02:40.635914999 +0000 UTC 
m=+993.112983791" observedRunningTime="2026-03-21 04:02:41.09599926 +0000 UTC m=+993.573068062" watchObservedRunningTime="2026-03-21 04:02:41.09670019 +0000 UTC m=+993.573769002" Mar 21 04:02:41 crc kubenswrapper[4685]: I0321 04:02:41.113664 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-qpvg8"] Mar 21 04:02:41 crc kubenswrapper[4685]: I0321 04:02:41.121268 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-qpvg8"] Mar 21 04:02:41 crc kubenswrapper[4685]: I0321 04:02:41.848190 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="barbican-kuttl-tests/memcached-0" Mar 21 04:02:42 crc kubenswrapper[4685]: E0321 04:02:42.027692 4685 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.158:51916->38.102.83.158:42143: write tcp 38.102.83.158:51916->38.102.83.158:42143: write: broken pipe Mar 21 04:02:42 crc kubenswrapper[4685]: I0321 04:02:42.310153 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67f7d3a9-f1d6-4c73-a95f-8783abe85339" path="/var/lib/kubelet/pods/67f7d3a9-f1d6-4c73-a95f-8783abe85339/volumes" Mar 21 04:02:45 crc kubenswrapper[4685]: I0321 04:02:45.170091 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/openstack-galera-0" Mar 21 04:02:45 crc kubenswrapper[4685]: I0321 04:02:45.170621 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="barbican-kuttl-tests/openstack-galera-0" Mar 21 04:02:45 crc kubenswrapper[4685]: I0321 04:02:45.191623 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/openstack-galera-2" Mar 21 04:02:45 crc kubenswrapper[4685]: I0321 04:02:45.191960 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="barbican-kuttl-tests/openstack-galera-2" Mar 21 04:02:45 crc kubenswrapper[4685]: I0321 04:02:45.198611 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/openstack-galera-1" Mar 21 04:02:45 crc kubenswrapper[4685]: I0321 04:02:45.198664 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="barbican-kuttl-tests/openstack-galera-1" Mar 21 04:02:47 crc kubenswrapper[4685]: I0321 04:02:47.769146 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="barbican-kuttl-tests/openstack-galera-2" Mar 21 04:02:47 crc kubenswrapper[4685]: I0321 04:02:47.833665 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="barbican-kuttl-tests/openstack-galera-2" Mar 21 04:02:49 crc kubenswrapper[4685]: I0321 04:02:49.764425 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/rabbitmq-cluster-operator-index-vl4k5" Mar 21 04:02:49 crc kubenswrapper[4685]: I0321 04:02:49.764726 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/rabbitmq-cluster-operator-index-vl4k5" Mar 21 04:02:49 crc kubenswrapper[4685]: I0321 04:02:49.792712 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/rabbitmq-cluster-operator-index-vl4k5" Mar 21 04:02:50 crc kubenswrapper[4685]: I0321 04:02:50.138513 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/rabbitmq-cluster-operator-index-vl4k5" Mar 21 04:02:52 crc kubenswrapper[4685]: I0321 04:02:52.082965 4685 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qzckr"] Mar 21 04:02:52 crc kubenswrapper[4685]: E0321 04:02:52.083869 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67f7d3a9-f1d6-4c73-a95f-8783abe85339" containerName="registry-server" Mar 21 04:02:52 crc kubenswrapper[4685]: I0321 04:02:52.083886 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="67f7d3a9-f1d6-4c73-a95f-8783abe85339" containerName="registry-server" Mar 21 04:02:52 crc kubenswrapper[4685]: I0321 04:02:52.084030 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="67f7d3a9-f1d6-4c73-a95f-8783abe85339" containerName="registry-server" Mar 21 04:02:52 crc kubenswrapper[4685]: I0321 04:02:52.084998 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qzckr" Mar 21 04:02:52 crc kubenswrapper[4685]: I0321 04:02:52.087042 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-2vwsz" Mar 21 04:02:52 crc kubenswrapper[4685]: I0321 04:02:52.096866 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qzckr"] Mar 21 04:02:52 crc kubenswrapper[4685]: I0321 04:02:52.262326 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b4b6e003-448d-437f-805e-5dd92c8ea2aa-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qzckr\" (UID: \"b4b6e003-448d-437f-805e-5dd92c8ea2aa\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qzckr" Mar 21 04:02:52 crc kubenswrapper[4685]: I0321 04:02:52.262412 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmv5q\" (UniqueName: \"kubernetes.io/projected/b4b6e003-448d-437f-805e-5dd92c8ea2aa-kube-api-access-vmv5q\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qzckr\" (UID: \"b4b6e003-448d-437f-805e-5dd92c8ea2aa\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qzckr" Mar 21 04:02:52 crc kubenswrapper[4685]: I0321 04:02:52.262453 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b4b6e003-448d-437f-805e-5dd92c8ea2aa-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qzckr\" (UID: \"b4b6e003-448d-437f-805e-5dd92c8ea2aa\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qzckr" Mar 21 04:02:52 crc kubenswrapper[4685]: I0321 04:02:52.363570 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b4b6e003-448d-437f-805e-5dd92c8ea2aa-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qzckr\" (UID: \"b4b6e003-448d-437f-805e-5dd92c8ea2aa\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qzckr" Mar 21 04:02:52 crc kubenswrapper[4685]: I0321 04:02:52.364027 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmv5q\" (UniqueName: \"kubernetes.io/projected/b4b6e003-448d-437f-805e-5dd92c8ea2aa-kube-api-access-vmv5q\") pod 
\"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qzckr\" (UID: \"b4b6e003-448d-437f-805e-5dd92c8ea2aa\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qzckr" Mar 21 04:02:52 crc kubenswrapper[4685]: I0321 04:02:52.364066 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b4b6e003-448d-437f-805e-5dd92c8ea2aa-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qzckr\" (UID: \"b4b6e003-448d-437f-805e-5dd92c8ea2aa\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qzckr" Mar 21 04:02:52 crc kubenswrapper[4685]: I0321 04:02:52.364228 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b4b6e003-448d-437f-805e-5dd92c8ea2aa-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qzckr\" (UID: \"b4b6e003-448d-437f-805e-5dd92c8ea2aa\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qzckr" Mar 21 04:02:52 crc kubenswrapper[4685]: I0321 04:02:52.364482 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b4b6e003-448d-437f-805e-5dd92c8ea2aa-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qzckr\" (UID: \"b4b6e003-448d-437f-805e-5dd92c8ea2aa\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qzckr" Mar 21 04:02:52 crc kubenswrapper[4685]: I0321 04:02:52.382700 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmv5q\" (UniqueName: \"kubernetes.io/projected/b4b6e003-448d-437f-805e-5dd92c8ea2aa-kube-api-access-vmv5q\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qzckr\" (UID: \"b4b6e003-448d-437f-805e-5dd92c8ea2aa\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qzckr" Mar 21 04:02:52 crc kubenswrapper[4685]: I0321 04:02:52.405317 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qzckr" Mar 21 04:02:53 crc kubenswrapper[4685]: I0321 04:02:53.882671 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qzckr"] Mar 21 04:02:53 crc kubenswrapper[4685]: I0321 04:02:53.913179 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/root-account-create-update-4cflz"] Mar 21 04:02:53 crc kubenswrapper[4685]: I0321 04:02:53.914195 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/root-account-create-update-4cflz" Mar 21 04:02:53 crc kubenswrapper[4685]: I0321 04:02:53.916392 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"openstack-mariadb-root-db-secret" Mar 21 04:02:53 crc kubenswrapper[4685]: I0321 04:02:53.923913 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/root-account-create-update-4cflz"] Mar 21 04:02:54 crc kubenswrapper[4685]: I0321 04:02:54.094414 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38f93032-3939-4916-9b4d-addcd91de7f6-operator-scripts\") pod \"root-account-create-update-4cflz\" (UID: \"38f93032-3939-4916-9b4d-addcd91de7f6\") " pod="barbican-kuttl-tests/root-account-create-update-4cflz" Mar 21 04:02:54 crc kubenswrapper[4685]: I0321 04:02:54.094808 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85hs5\" (UniqueName: \"kubernetes.io/projected/38f93032-3939-4916-9b4d-addcd91de7f6-kube-api-access-85hs5\") pod \"root-account-create-update-4cflz\" (UID: \"38f93032-3939-4916-9b4d-addcd91de7f6\") " pod="barbican-kuttl-tests/root-account-create-update-4cflz" Mar 21 04:02:54 crc kubenswrapper[4685]: I0321 04:02:54.146326 4685 generic.go:334] "Generic (PLEG): container finished" podID="b4b6e003-448d-437f-805e-5dd92c8ea2aa" containerID="bd057d98b9b981c88d0cd15d750d56676000ca2649004b19bf75f9409685033f" exitCode=0 Mar 21 04:02:54 crc kubenswrapper[4685]: I0321 04:02:54.146367 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qzckr" event={"ID":"b4b6e003-448d-437f-805e-5dd92c8ea2aa","Type":"ContainerDied","Data":"bd057d98b9b981c88d0cd15d750d56676000ca2649004b19bf75f9409685033f"} Mar 21 04:02:54 crc kubenswrapper[4685]: I0321 04:02:54.146391 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qzckr" event={"ID":"b4b6e003-448d-437f-805e-5dd92c8ea2aa","Type":"ContainerStarted","Data":"cb5718ea25ac8d1bc42df829676f4397b6d52f42fd6f768d65f4befee7a9f14b"} Mar 21 04:02:54 crc kubenswrapper[4685]: I0321 04:02:54.196322 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38f93032-3939-4916-9b4d-addcd91de7f6-operator-scripts\") pod \"root-account-create-update-4cflz\" (UID: \"38f93032-3939-4916-9b4d-addcd91de7f6\") " pod="barbican-kuttl-tests/root-account-create-update-4cflz" Mar 21 04:02:54 crc kubenswrapper[4685]: I0321 04:02:54.196395 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85hs5\" (UniqueName: \"kubernetes.io/projected/38f93032-3939-4916-9b4d-addcd91de7f6-kube-api-access-85hs5\") pod \"root-account-create-update-4cflz\" (UID: \"38f93032-3939-4916-9b4d-addcd91de7f6\") " pod="barbican-kuttl-tests/root-account-create-update-4cflz" Mar 21 04:02:54 crc kubenswrapper[4685]: I0321 04:02:54.197150 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38f93032-3939-4916-9b4d-addcd91de7f6-operator-scripts\") pod \"root-account-create-update-4cflz\" (UID: \"38f93032-3939-4916-9b4d-addcd91de7f6\") " pod="barbican-kuttl-tests/root-account-create-update-4cflz" Mar 21 04:02:54 crc 
kubenswrapper[4685]: I0321 04:02:54.234102 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85hs5\" (UniqueName: \"kubernetes.io/projected/38f93032-3939-4916-9b4d-addcd91de7f6-kube-api-access-85hs5\") pod \"root-account-create-update-4cflz\" (UID: \"38f93032-3939-4916-9b4d-addcd91de7f6\") " pod="barbican-kuttl-tests/root-account-create-update-4cflz" Mar 21 04:02:54 crc kubenswrapper[4685]: I0321 04:02:54.245415 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/root-account-create-update-4cflz" Mar 21 04:02:54 crc kubenswrapper[4685]: I0321 04:02:54.691682 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/root-account-create-update-4cflz"] Mar 21 04:02:55 crc kubenswrapper[4685]: I0321 04:02:55.154014 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/root-account-create-update-4cflz" event={"ID":"38f93032-3939-4916-9b4d-addcd91de7f6","Type":"ContainerStarted","Data":"e0711fad77f15547cf76fbed59b0298329ab898dd1c5ba6eaa30e3a4802b5499"} Mar 21 04:02:55 crc kubenswrapper[4685]: I0321 04:02:55.154388 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/root-account-create-update-4cflz" event={"ID":"38f93032-3939-4916-9b4d-addcd91de7f6","Type":"ContainerStarted","Data":"4e73207e6b51009c35180858f56d62f8814b6bc2ca2248284a7c727b92fa8bf0"} Mar 21 04:02:55 crc kubenswrapper[4685]: I0321 04:02:55.155867 4685 generic.go:334] "Generic (PLEG): container finished" podID="b4b6e003-448d-437f-805e-5dd92c8ea2aa" containerID="1a79bccbf11405af6049a2b18f3f34e8ab927b7532b1d8a690ab6978034e65de" exitCode=0 Mar 21 04:02:55 crc kubenswrapper[4685]: I0321 04:02:55.155913 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qzckr" event={"ID":"b4b6e003-448d-437f-805e-5dd92c8ea2aa","Type":"ContainerDied","Data":"1a79bccbf11405af6049a2b18f3f34e8ab927b7532b1d8a690ab6978034e65de"} Mar 21 04:02:55 crc kubenswrapper[4685]: I0321 04:02:55.172626 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/root-account-create-update-4cflz" podStartSLOduration=2.172603705 podStartE2EDuration="2.172603705s" podCreationTimestamp="2026-03-21 04:02:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:02:55.169442974 +0000 UTC m=+1007.646511766" watchObservedRunningTime="2026-03-21 04:02:55.172603705 +0000 UTC m=+1007.649672497" Mar 21 04:02:55 crc kubenswrapper[4685]: I0321 04:02:55.278332 4685 prober.go:107] "Probe failed" probeType="Readiness" pod="barbican-kuttl-tests/openstack-galera-2" podUID="9ee610c4-8416-4d1c-a6b4-2324f1541b1c" containerName="galera" probeResult="failure" output=< Mar 21 04:02:55 crc kubenswrapper[4685]: wsrep_local_state_comment (Donor/Desynced) differs from Synced Mar 21 04:02:55 crc kubenswrapper[4685]: > Mar 21 04:02:56 crc kubenswrapper[4685]: I0321 04:02:56.164246 4685 generic.go:334] "Generic (PLEG): container finished" podID="b4b6e003-448d-437f-805e-5dd92c8ea2aa" containerID="afe8072d40f41f86cd8e9ca3876885870fa9135be27eb3bf9c6892869419732a" exitCode=0 Mar 21 04:02:56 crc kubenswrapper[4685]: I0321 04:02:56.164332 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qzckr" 
event={"ID":"b4b6e003-448d-437f-805e-5dd92c8ea2aa","Type":"ContainerDied","Data":"afe8072d40f41f86cd8e9ca3876885870fa9135be27eb3bf9c6892869419732a"} Mar 21 04:02:56 crc kubenswrapper[4685]: I0321 04:02:56.847781 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gqlv8"] Mar 21 04:02:56 crc kubenswrapper[4685]: I0321 04:02:56.849615 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gqlv8" Mar 21 04:02:56 crc kubenswrapper[4685]: I0321 04:02:56.863325 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gqlv8"] Mar 21 04:02:57 crc kubenswrapper[4685]: I0321 04:02:57.034419 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8730ef7e-146f-470d-acc9-fec7740ca406-utilities\") pod \"certified-operators-gqlv8\" (UID: \"8730ef7e-146f-470d-acc9-fec7740ca406\") " pod="openshift-marketplace/certified-operators-gqlv8" Mar 21 04:02:57 crc kubenswrapper[4685]: I0321 04:02:57.034594 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8730ef7e-146f-470d-acc9-fec7740ca406-catalog-content\") pod \"certified-operators-gqlv8\" (UID: \"8730ef7e-146f-470d-acc9-fec7740ca406\") " pod="openshift-marketplace/certified-operators-gqlv8" Mar 21 04:02:57 crc kubenswrapper[4685]: I0321 04:02:57.034729 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-498wr\" (UniqueName: \"kubernetes.io/projected/8730ef7e-146f-470d-acc9-fec7740ca406-kube-api-access-498wr\") pod \"certified-operators-gqlv8\" (UID: \"8730ef7e-146f-470d-acc9-fec7740ca406\") " pod="openshift-marketplace/certified-operators-gqlv8" Mar 21 04:02:57 crc kubenswrapper[4685]: I0321 04:02:57.135583 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8730ef7e-146f-470d-acc9-fec7740ca406-utilities\") pod \"certified-operators-gqlv8\" (UID: \"8730ef7e-146f-470d-acc9-fec7740ca406\") " pod="openshift-marketplace/certified-operators-gqlv8" Mar 21 04:02:57 crc kubenswrapper[4685]: I0321 04:02:57.135647 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8730ef7e-146f-470d-acc9-fec7740ca406-catalog-content\") pod \"certified-operators-gqlv8\" (UID: \"8730ef7e-146f-470d-acc9-fec7740ca406\") " pod="openshift-marketplace/certified-operators-gqlv8" Mar 21 04:02:57 crc kubenswrapper[4685]: I0321 04:02:57.135697 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-498wr\" (UniqueName: \"kubernetes.io/projected/8730ef7e-146f-470d-acc9-fec7740ca406-kube-api-access-498wr\") pod \"certified-operators-gqlv8\" (UID: \"8730ef7e-146f-470d-acc9-fec7740ca406\") " pod="openshift-marketplace/certified-operators-gqlv8" Mar 21 04:02:57 crc kubenswrapper[4685]: I0321 04:02:57.136093 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8730ef7e-146f-470d-acc9-fec7740ca406-utilities\") pod \"certified-operators-gqlv8\" (UID: \"8730ef7e-146f-470d-acc9-fec7740ca406\") " pod="openshift-marketplace/certified-operators-gqlv8" Mar 21 04:02:57 crc kubenswrapper[4685]: I0321 
Mar 21 04:02:57 crc kubenswrapper[4685]: I0321 04:02:57.136165 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8730ef7e-146f-470d-acc9-fec7740ca406-catalog-content\") pod \"certified-operators-gqlv8\" (UID: \"8730ef7e-146f-470d-acc9-fec7740ca406\") " pod="openshift-marketplace/certified-operators-gqlv8"
Mar 21 04:02:57 crc kubenswrapper[4685]: I0321 04:02:57.155720 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-498wr\" (UniqueName: \"kubernetes.io/projected/8730ef7e-146f-470d-acc9-fec7740ca406-kube-api-access-498wr\") pod \"certified-operators-gqlv8\" (UID: \"8730ef7e-146f-470d-acc9-fec7740ca406\") " pod="openshift-marketplace/certified-operators-gqlv8"
Mar 21 04:02:57 crc kubenswrapper[4685]: I0321 04:02:57.171722 4685 generic.go:334] "Generic (PLEG): container finished" podID="38f93032-3939-4916-9b4d-addcd91de7f6" containerID="e0711fad77f15547cf76fbed59b0298329ab898dd1c5ba6eaa30e3a4802b5499" exitCode=0
Mar 21 04:02:57 crc kubenswrapper[4685]: I0321 04:02:57.171920 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/root-account-create-update-4cflz" event={"ID":"38f93032-3939-4916-9b4d-addcd91de7f6","Type":"ContainerDied","Data":"e0711fad77f15547cf76fbed59b0298329ab898dd1c5ba6eaa30e3a4802b5499"}
Mar 21 04:02:57 crc kubenswrapper[4685]: I0321 04:02:57.182913 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gqlv8"
Mar 21 04:02:57 crc kubenswrapper[4685]: I0321 04:02:57.519183 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qzckr"
Mar 21 04:02:57 crc kubenswrapper[4685]: I0321 04:02:57.643215 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b4b6e003-448d-437f-805e-5dd92c8ea2aa-bundle\") pod \"b4b6e003-448d-437f-805e-5dd92c8ea2aa\" (UID: \"b4b6e003-448d-437f-805e-5dd92c8ea2aa\") "
Mar 21 04:02:57 crc kubenswrapper[4685]: I0321 04:02:57.643307 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b4b6e003-448d-437f-805e-5dd92c8ea2aa-util\") pod \"b4b6e003-448d-437f-805e-5dd92c8ea2aa\" (UID: \"b4b6e003-448d-437f-805e-5dd92c8ea2aa\") "
Mar 21 04:02:57 crc kubenswrapper[4685]: I0321 04:02:57.643377 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmv5q\" (UniqueName: \"kubernetes.io/projected/b4b6e003-448d-437f-805e-5dd92c8ea2aa-kube-api-access-vmv5q\") pod \"b4b6e003-448d-437f-805e-5dd92c8ea2aa\" (UID: \"b4b6e003-448d-437f-805e-5dd92c8ea2aa\") "
Mar 21 04:02:57 crc kubenswrapper[4685]: I0321 04:02:57.643980 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4b6e003-448d-437f-805e-5dd92c8ea2aa-bundle" (OuterVolumeSpecName: "bundle") pod "b4b6e003-448d-437f-805e-5dd92c8ea2aa" (UID: "b4b6e003-448d-437f-805e-5dd92c8ea2aa"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:02:57 crc kubenswrapper[4685]: I0321 04:02:57.648965 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4b6e003-448d-437f-805e-5dd92c8ea2aa-kube-api-access-vmv5q" (OuterVolumeSpecName: "kube-api-access-vmv5q") pod "b4b6e003-448d-437f-805e-5dd92c8ea2aa" (UID: "b4b6e003-448d-437f-805e-5dd92c8ea2aa"). InnerVolumeSpecName "kube-api-access-vmv5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:02:57 crc kubenswrapper[4685]: I0321 04:02:57.714478 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gqlv8"] Mar 21 04:02:57 crc kubenswrapper[4685]: W0321 04:02:57.722623 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8730ef7e_146f_470d_acc9_fec7740ca406.slice/crio-f75fbc15b9c5a70c8d1e18aa11ab5597ca4f64c8d1d21de497ed493b43791909 WatchSource:0}: Error finding container f75fbc15b9c5a70c8d1e18aa11ab5597ca4f64c8d1d21de497ed493b43791909: Status 404 returned error can't find the container with id f75fbc15b9c5a70c8d1e18aa11ab5597ca4f64c8d1d21de497ed493b43791909 Mar 21 04:02:57 crc kubenswrapper[4685]: I0321 04:02:57.744436 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmv5q\" (UniqueName: \"kubernetes.io/projected/b4b6e003-448d-437f-805e-5dd92c8ea2aa-kube-api-access-vmv5q\") on node \"crc\" DevicePath \"\"" Mar 21 04:02:57 crc kubenswrapper[4685]: I0321 04:02:57.744464 4685 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b4b6e003-448d-437f-805e-5dd92c8ea2aa-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:02:57 crc kubenswrapper[4685]: I0321 04:02:57.744473 4685 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b4b6e003-448d-437f-805e-5dd92c8ea2aa-util\") on node \"crc\" DevicePath \"\"" Mar 21 04:02:58 crc kubenswrapper[4685]: I0321 04:02:58.180403 4685 generic.go:334] "Generic (PLEG): container finished" podID="8730ef7e-146f-470d-acc9-fec7740ca406" containerID="a725919639018c1cc2e9e97f8c5a3907ca685209652fa9ed3a4961717e046f8c" exitCode=0 Mar 21 04:02:58 crc kubenswrapper[4685]: I0321 04:02:58.180459 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqlv8" event={"ID":"8730ef7e-146f-470d-acc9-fec7740ca406","Type":"ContainerDied","Data":"a725919639018c1cc2e9e97f8c5a3907ca685209652fa9ed3a4961717e046f8c"} Mar 21 04:02:58 crc kubenswrapper[4685]: I0321 04:02:58.180750 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqlv8" event={"ID":"8730ef7e-146f-470d-acc9-fec7740ca406","Type":"ContainerStarted","Data":"f75fbc15b9c5a70c8d1e18aa11ab5597ca4f64c8d1d21de497ed493b43791909"} Mar 21 04:02:58 crc kubenswrapper[4685]: I0321 04:02:58.184654 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qzckr" event={"ID":"b4b6e003-448d-437f-805e-5dd92c8ea2aa","Type":"ContainerDied","Data":"cb5718ea25ac8d1bc42df829676f4397b6d52f42fd6f768d65f4befee7a9f14b"} Mar 21 04:02:58 crc kubenswrapper[4685]: I0321 04:02:58.184689 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb5718ea25ac8d1bc42df829676f4397b6d52f42fd6f768d65f4befee7a9f14b" Mar 21 04:02:58 crc 
Mar 21 04:02:58 crc kubenswrapper[4685]: I0321 04:02:58.184707 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qzckr"
Mar 21 04:02:58 crc kubenswrapper[4685]: I0321 04:02:58.457185 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/root-account-create-update-4cflz"
Mar 21 04:02:58 crc kubenswrapper[4685]: I0321 04:02:58.553281 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38f93032-3939-4916-9b4d-addcd91de7f6-operator-scripts\") pod \"38f93032-3939-4916-9b4d-addcd91de7f6\" (UID: \"38f93032-3939-4916-9b4d-addcd91de7f6\") "
Mar 21 04:02:58 crc kubenswrapper[4685]: I0321 04:02:58.553680 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85hs5\" (UniqueName: \"kubernetes.io/projected/38f93032-3939-4916-9b4d-addcd91de7f6-kube-api-access-85hs5\") pod \"38f93032-3939-4916-9b4d-addcd91de7f6\" (UID: \"38f93032-3939-4916-9b4d-addcd91de7f6\") "
Mar 21 04:02:58 crc kubenswrapper[4685]: I0321 04:02:58.554058 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38f93032-3939-4916-9b4d-addcd91de7f6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "38f93032-3939-4916-9b4d-addcd91de7f6" (UID: "38f93032-3939-4916-9b4d-addcd91de7f6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:02:58 crc kubenswrapper[4685]: I0321 04:02:58.654950 4685 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38f93032-3939-4916-9b4d-addcd91de7f6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:02:58 crc kubenswrapper[4685]: I0321 04:02:58.655184 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85hs5\" (UniqueName: \"kubernetes.io/projected/38f93032-3939-4916-9b4d-addcd91de7f6-kube-api-access-85hs5\") on node \"crc\" DevicePath \"\"" Mar 21 04:02:59 crc kubenswrapper[4685]: I0321 04:02:59.044698 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jdpdp"] Mar 21 04:02:59 crc kubenswrapper[4685]: E0321 04:02:59.045009 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38f93032-3939-4916-9b4d-addcd91de7f6" containerName="mariadb-account-create-update" Mar 21 04:02:59 crc kubenswrapper[4685]: I0321 04:02:59.045023 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="38f93032-3939-4916-9b4d-addcd91de7f6" containerName="mariadb-account-create-update" Mar 21 04:02:59 crc kubenswrapper[4685]: E0321 04:02:59.045042 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b6e003-448d-437f-805e-5dd92c8ea2aa" containerName="extract" Mar 21 04:02:59 crc kubenswrapper[4685]: I0321 04:02:59.045051 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b6e003-448d-437f-805e-5dd92c8ea2aa" containerName="extract" Mar 21 04:02:59 crc kubenswrapper[4685]: E0321 04:02:59.045069 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b6e003-448d-437f-805e-5dd92c8ea2aa" containerName="pull" Mar 21 04:02:59 crc kubenswrapper[4685]: I0321 04:02:59.045079 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b6e003-448d-437f-805e-5dd92c8ea2aa" containerName="pull" Mar 21 04:02:59 crc kubenswrapper[4685]: E0321 04:02:59.045092 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b6e003-448d-437f-805e-5dd92c8ea2aa" containerName="util" Mar 21 04:02:59 crc kubenswrapper[4685]: I0321 04:02:59.045100 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b6e003-448d-437f-805e-5dd92c8ea2aa" containerName="util" Mar 21 04:02:59 crc kubenswrapper[4685]: I0321 04:02:59.045248 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="38f93032-3939-4916-9b4d-addcd91de7f6" containerName="mariadb-account-create-update" Mar 21 04:02:59 crc kubenswrapper[4685]: I0321 04:02:59.045265 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4b6e003-448d-437f-805e-5dd92c8ea2aa" containerName="extract" Mar 21 04:02:59 crc kubenswrapper[4685]: I0321 04:02:59.046393 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jdpdp" Mar 21 04:02:59 crc kubenswrapper[4685]: I0321 04:02:59.055195 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jdpdp"] Mar 21 04:02:59 crc kubenswrapper[4685]: I0321 04:02:59.163272 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl5l7\" (UniqueName: \"kubernetes.io/projected/2809b575-0d1a-4860-9b47-097651eb2dd1-kube-api-access-cl5l7\") pod \"redhat-operators-jdpdp\" (UID: \"2809b575-0d1a-4860-9b47-097651eb2dd1\") " pod="openshift-marketplace/redhat-operators-jdpdp" Mar 21 04:02:59 crc kubenswrapper[4685]: I0321 04:02:59.163326 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2809b575-0d1a-4860-9b47-097651eb2dd1-catalog-content\") pod \"redhat-operators-jdpdp\" (UID: \"2809b575-0d1a-4860-9b47-097651eb2dd1\") " pod="openshift-marketplace/redhat-operators-jdpdp" Mar 21 04:02:59 crc kubenswrapper[4685]: I0321 04:02:59.163367 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2809b575-0d1a-4860-9b47-097651eb2dd1-utilities\") pod \"redhat-operators-jdpdp\" (UID: \"2809b575-0d1a-4860-9b47-097651eb2dd1\") " pod="openshift-marketplace/redhat-operators-jdpdp" Mar 21 04:02:59 crc kubenswrapper[4685]: I0321 04:02:59.192628 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/root-account-create-update-4cflz" event={"ID":"38f93032-3939-4916-9b4d-addcd91de7f6","Type":"ContainerDied","Data":"4e73207e6b51009c35180858f56d62f8814b6bc2ca2248284a7c727b92fa8bf0"} Mar 21 04:02:59 crc kubenswrapper[4685]: I0321 04:02:59.192669 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e73207e6b51009c35180858f56d62f8814b6bc2ca2248284a7c727b92fa8bf0" Mar 21 04:02:59 crc kubenswrapper[4685]: I0321 04:02:59.192722 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/root-account-create-update-4cflz" Mar 21 04:02:59 crc kubenswrapper[4685]: I0321 04:02:59.195478 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqlv8" event={"ID":"8730ef7e-146f-470d-acc9-fec7740ca406","Type":"ContainerStarted","Data":"61881e05c19453fa5da44b3b1ce5291f40c374c212d7e903860e5ec7272f0818"} Mar 21 04:02:59 crc kubenswrapper[4685]: I0321 04:02:59.264937 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2809b575-0d1a-4860-9b47-097651eb2dd1-catalog-content\") pod \"redhat-operators-jdpdp\" (UID: \"2809b575-0d1a-4860-9b47-097651eb2dd1\") " pod="openshift-marketplace/redhat-operators-jdpdp" Mar 21 04:02:59 crc kubenswrapper[4685]: I0321 04:02:59.265016 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2809b575-0d1a-4860-9b47-097651eb2dd1-utilities\") pod \"redhat-operators-jdpdp\" (UID: \"2809b575-0d1a-4860-9b47-097651eb2dd1\") " pod="openshift-marketplace/redhat-operators-jdpdp" Mar 21 04:02:59 crc kubenswrapper[4685]: I0321 04:02:59.265109 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl5l7\" (UniqueName: \"kubernetes.io/projected/2809b575-0d1a-4860-9b47-097651eb2dd1-kube-api-access-cl5l7\") pod \"redhat-operators-jdpdp\" (UID: \"2809b575-0d1a-4860-9b47-097651eb2dd1\") " pod="openshift-marketplace/redhat-operators-jdpdp" Mar 21 04:02:59 crc kubenswrapper[4685]: I0321 04:02:59.266298 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2809b575-0d1a-4860-9b47-097651eb2dd1-utilities\") pod \"redhat-operators-jdpdp\" (UID: \"2809b575-0d1a-4860-9b47-097651eb2dd1\") " pod="openshift-marketplace/redhat-operators-jdpdp" Mar 21 04:02:59 crc kubenswrapper[4685]: I0321 04:02:59.266811 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2809b575-0d1a-4860-9b47-097651eb2dd1-catalog-content\") pod \"redhat-operators-jdpdp\" (UID: \"2809b575-0d1a-4860-9b47-097651eb2dd1\") " pod="openshift-marketplace/redhat-operators-jdpdp" Mar 21 04:02:59 crc kubenswrapper[4685]: I0321 04:02:59.291017 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl5l7\" (UniqueName: \"kubernetes.io/projected/2809b575-0d1a-4860-9b47-097651eb2dd1-kube-api-access-cl5l7\") pod \"redhat-operators-jdpdp\" (UID: \"2809b575-0d1a-4860-9b47-097651eb2dd1\") " pod="openshift-marketplace/redhat-operators-jdpdp" Mar 21 04:02:59 crc kubenswrapper[4685]: I0321 04:02:59.359471 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jdpdp" Mar 21 04:02:59 crc kubenswrapper[4685]: I0321 04:02:59.803358 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jdpdp"] Mar 21 04:02:59 crc kubenswrapper[4685]: W0321 04:02:59.806630 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2809b575_0d1a_4860_9b47_097651eb2dd1.slice/crio-2f13645cfd9c4e6b4fec7716103ea05139eaf1e1e2fb420b2a22f9c52d535175 WatchSource:0}: Error finding container 2f13645cfd9c4e6b4fec7716103ea05139eaf1e1e2fb420b2a22f9c52d535175: Status 404 returned error can't find the container with id 2f13645cfd9c4e6b4fec7716103ea05139eaf1e1e2fb420b2a22f9c52d535175 Mar 21 04:03:00 crc kubenswrapper[4685]: I0321 04:03:00.202421 4685 generic.go:334] "Generic (PLEG): container finished" podID="8730ef7e-146f-470d-acc9-fec7740ca406" containerID="61881e05c19453fa5da44b3b1ce5291f40c374c212d7e903860e5ec7272f0818" exitCode=0 Mar 21 04:03:00 crc kubenswrapper[4685]: I0321 04:03:00.202484 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqlv8" event={"ID":"8730ef7e-146f-470d-acc9-fec7740ca406","Type":"ContainerDied","Data":"61881e05c19453fa5da44b3b1ce5291f40c374c212d7e903860e5ec7272f0818"} Mar 21 04:03:00 crc kubenswrapper[4685]: I0321 04:03:00.204436 4685 generic.go:334] "Generic (PLEG): container finished" podID="2809b575-0d1a-4860-9b47-097651eb2dd1" containerID="5900adee8afe52b940f822a33957d6bca94d8fe117bc706f33beb88f781b4c36" exitCode=0 Mar 21 04:03:00 crc kubenswrapper[4685]: I0321 04:03:00.204469 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jdpdp" event={"ID":"2809b575-0d1a-4860-9b47-097651eb2dd1","Type":"ContainerDied","Data":"5900adee8afe52b940f822a33957d6bca94d8fe117bc706f33beb88f781b4c36"} Mar 21 04:03:00 crc kubenswrapper[4685]: I0321 04:03:00.204493 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jdpdp" event={"ID":"2809b575-0d1a-4860-9b47-097651eb2dd1","Type":"ContainerStarted","Data":"2f13645cfd9c4e6b4fec7716103ea05139eaf1e1e2fb420b2a22f9c52d535175"} Mar 21 04:03:00 crc kubenswrapper[4685]: I0321 04:03:00.502516 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="barbican-kuttl-tests/openstack-galera-1" Mar 21 04:03:00 crc kubenswrapper[4685]: I0321 04:03:00.567751 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="barbican-kuttl-tests/openstack-galera-1" Mar 21 04:03:01 crc kubenswrapper[4685]: I0321 04:03:01.211464 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqlv8" event={"ID":"8730ef7e-146f-470d-acc9-fec7740ca406","Type":"ContainerStarted","Data":"62fee6f6b1c6432834ce3153bcbf14ec28fdba91aa1da8b338acb0190f43aa78"} Mar 21 04:03:01 crc kubenswrapper[4685]: I0321 04:03:01.213475 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jdpdp" event={"ID":"2809b575-0d1a-4860-9b47-097651eb2dd1","Type":"ContainerStarted","Data":"b38971fe97b98a35f068b168051c47239fcab1cb731b9d255034b18ba2623b6e"} Mar 21 04:03:01 crc kubenswrapper[4685]: I0321 04:03:01.234028 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gqlv8" podStartSLOduration=2.714336948 podStartE2EDuration="5.234007046s" 
podCreationTimestamp="2026-03-21 04:02:56 +0000 UTC" firstStartedPulling="2026-03-21 04:02:58.182312329 +0000 UTC m=+1010.659381121" lastFinishedPulling="2026-03-21 04:03:00.701982427 +0000 UTC m=+1013.179051219" observedRunningTime="2026-03-21 04:03:01.228729514 +0000 UTC m=+1013.705798316" watchObservedRunningTime="2026-03-21 04:03:01.234007046 +0000 UTC m=+1013.711075848" Mar 21 04:03:01 crc kubenswrapper[4685]: E0321 04:03:01.746986 4685 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2809b575_0d1a_4860_9b47_097651eb2dd1.slice/crio-conmon-b38971fe97b98a35f068b168051c47239fcab1cb731b9d255034b18ba2623b6e.scope\": RecentStats: unable to find data in memory cache]" Mar 21 04:03:02 crc kubenswrapper[4685]: I0321 04:03:02.050419 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="barbican-kuttl-tests/openstack-galera-0" Mar 21 04:03:02 crc kubenswrapper[4685]: I0321 04:03:02.139702 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="barbican-kuttl-tests/openstack-galera-0" Mar 21 04:03:02 crc kubenswrapper[4685]: I0321 04:03:02.220865 4685 generic.go:334] "Generic (PLEG): container finished" podID="2809b575-0d1a-4860-9b47-097651eb2dd1" containerID="b38971fe97b98a35f068b168051c47239fcab1cb731b9d255034b18ba2623b6e" exitCode=0 Mar 21 04:03:02 crc kubenswrapper[4685]: I0321 04:03:02.220972 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jdpdp" event={"ID":"2809b575-0d1a-4860-9b47-097651eb2dd1","Type":"ContainerDied","Data":"b38971fe97b98a35f068b168051c47239fcab1cb731b9d255034b18ba2623b6e"} Mar 21 04:03:03 crc kubenswrapper[4685]: I0321 04:03:03.233248 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jdpdp" event={"ID":"2809b575-0d1a-4860-9b47-097651eb2dd1","Type":"ContainerStarted","Data":"f54453561fc6d4cb8ff0d381b7280d051ecf949d7272fee52fccdc3c37f0e5f8"} Mar 21 04:03:03 crc kubenswrapper[4685]: I0321 04:03:03.258334 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jdpdp" podStartSLOduration=1.5913037399999999 podStartE2EDuration="4.258315215s" podCreationTimestamp="2026-03-21 04:02:59 +0000 UTC" firstStartedPulling="2026-03-21 04:03:00.205896226 +0000 UTC m=+1012.682965018" lastFinishedPulling="2026-03-21 04:03:02.872907701 +0000 UTC m=+1015.349976493" observedRunningTime="2026-03-21 04:03:03.253218218 +0000 UTC m=+1015.730287030" watchObservedRunningTime="2026-03-21 04:03:03.258315215 +0000 UTC m=+1015.735384007" Mar 21 04:03:07 crc kubenswrapper[4685]: I0321 04:03:07.183315 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gqlv8" Mar 21 04:03:07 crc kubenswrapper[4685]: I0321 04:03:07.183681 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gqlv8" Mar 21 04:03:07 crc kubenswrapper[4685]: I0321 04:03:07.228767 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gqlv8" Mar 21 04:03:07 crc kubenswrapper[4685]: I0321 04:03:07.299830 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gqlv8" Mar 21 04:03:09 crc kubenswrapper[4685]: I0321 04:03:09.172148 4685 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-tf44w"] Mar 21 04:03:09 crc kubenswrapper[4685]: I0321 04:03:09.172879 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-tf44w" Mar 21 04:03:09 crc kubenswrapper[4685]: I0321 04:03:09.174420 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-dockercfg-mc2kb" Mar 21 04:03:09 crc kubenswrapper[4685]: I0321 04:03:09.184214 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-tf44w"] Mar 21 04:03:09 crc kubenswrapper[4685]: I0321 04:03:09.296496 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr5j8\" (UniqueName: \"kubernetes.io/projected/5397a9a1-b670-42a8-8515-8cf15e8aa2d4-kube-api-access-rr5j8\") pod \"rabbitmq-cluster-operator-779fc9694b-tf44w\" (UID: \"5397a9a1-b670-42a8-8515-8cf15e8aa2d4\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-tf44w" Mar 21 04:03:09 crc kubenswrapper[4685]: I0321 04:03:09.359855 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jdpdp" Mar 21 04:03:09 crc kubenswrapper[4685]: I0321 04:03:09.359920 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jdpdp" Mar 21 04:03:09 crc kubenswrapper[4685]: I0321 04:03:09.398608 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr5j8\" (UniqueName: \"kubernetes.io/projected/5397a9a1-b670-42a8-8515-8cf15e8aa2d4-kube-api-access-rr5j8\") pod \"rabbitmq-cluster-operator-779fc9694b-tf44w\" (UID: \"5397a9a1-b670-42a8-8515-8cf15e8aa2d4\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-tf44w" Mar 21 04:03:09 crc kubenswrapper[4685]: I0321 04:03:09.418084 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr5j8\" (UniqueName: \"kubernetes.io/projected/5397a9a1-b670-42a8-8515-8cf15e8aa2d4-kube-api-access-rr5j8\") pod \"rabbitmq-cluster-operator-779fc9694b-tf44w\" (UID: \"5397a9a1-b670-42a8-8515-8cf15e8aa2d4\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-tf44w" Mar 21 04:03:09 crc kubenswrapper[4685]: I0321 04:03:09.486932 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-tf44w" Mar 21 04:03:09 crc kubenswrapper[4685]: I0321 04:03:09.685183 4685 patch_prober.go:28] interesting pod/machine-config-daemon-7r9cg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:03:09 crc kubenswrapper[4685]: I0321 04:03:09.685555 4685 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:03:09 crc kubenswrapper[4685]: I0321 04:03:09.685604 4685 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" Mar 21 04:03:09 crc kubenswrapper[4685]: I0321 04:03:09.686152 4685 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ac4ffd676ad57605265aed5caa44cae8130cfde3468685b94b3265e3fc4a39a0"} pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 04:03:09 crc kubenswrapper[4685]: I0321 04:03:09.686194 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerName="machine-config-daemon" containerID="cri-o://ac4ffd676ad57605265aed5caa44cae8130cfde3468685b94b3265e3fc4a39a0" gracePeriod=600 Mar 21 04:03:09 crc kubenswrapper[4685]: I0321 04:03:09.954570 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-tf44w"] Mar 21 04:03:09 crc kubenswrapper[4685]: W0321 04:03:09.961511 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5397a9a1_b670_42a8_8515_8cf15e8aa2d4.slice/crio-e54c05fc1782ad427efdac278a923b7065f8e255ba178f5febcda6ad0aa3364b WatchSource:0}: Error finding container e54c05fc1782ad427efdac278a923b7065f8e255ba178f5febcda6ad0aa3364b: Status 404 returned error can't find the container with id e54c05fc1782ad427efdac278a923b7065f8e255ba178f5febcda6ad0aa3364b Mar 21 04:03:10 crc kubenswrapper[4685]: I0321 04:03:10.279139 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-tf44w" event={"ID":"5397a9a1-b670-42a8-8515-8cf15e8aa2d4","Type":"ContainerStarted","Data":"e54c05fc1782ad427efdac278a923b7065f8e255ba178f5febcda6ad0aa3364b"} Mar 21 04:03:10 crc kubenswrapper[4685]: I0321 04:03:10.281592 4685 generic.go:334] "Generic (PLEG): container finished" podID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerID="ac4ffd676ad57605265aed5caa44cae8130cfde3468685b94b3265e3fc4a39a0" exitCode=0 Mar 21 04:03:10 crc kubenswrapper[4685]: I0321 04:03:10.281642 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" event={"ID":"cea46fe2-4e41-43ab-a069-cb30fb4e732c","Type":"ContainerDied","Data":"ac4ffd676ad57605265aed5caa44cae8130cfde3468685b94b3265e3fc4a39a0"} Mar 21 04:03:10 crc kubenswrapper[4685]: I0321 
Mar 21 04:03:10 crc kubenswrapper[4685]: I0321 04:03:10.281682 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" event={"ID":"cea46fe2-4e41-43ab-a069-cb30fb4e732c","Type":"ContainerStarted","Data":"fc184a6e763e19dcb85e7464f3adc7dbb9e9291d839d9e13c38f6aed20771d12"}
Mar 21 04:03:10 crc kubenswrapper[4685]: I0321 04:03:10.281708 4685 scope.go:117] "RemoveContainer" containerID="da8be442c3ea2f96e685bee081e96f02736707ffa414186cd8dedbc178b8c1c5"
Mar 21 04:03:10 crc kubenswrapper[4685]: I0321 04:03:10.400311 4685 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jdpdp" podUID="2809b575-0d1a-4860-9b47-097651eb2dd1" containerName="registry-server" probeResult="failure" output=<
Mar 21 04:03:10 crc kubenswrapper[4685]: timeout: failed to connect service ":50051" within 1s
Mar 21 04:03:10 crc kubenswrapper[4685]: >
Mar 21 04:03:11 crc kubenswrapper[4685]: I0321 04:03:11.637322 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gqlv8"]
Mar 21 04:03:11 crc kubenswrapper[4685]: I0321 04:03:11.637930 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gqlv8" podUID="8730ef7e-146f-470d-acc9-fec7740ca406" containerName="registry-server" containerID="cri-o://62fee6f6b1c6432834ce3153bcbf14ec28fdba91aa1da8b338acb0190f43aa78" gracePeriod=2
Mar 21 04:03:13 crc kubenswrapper[4685]: I0321 04:03:13.308245 4685 generic.go:334] "Generic (PLEG): container finished" podID="8730ef7e-146f-470d-acc9-fec7740ca406" containerID="62fee6f6b1c6432834ce3153bcbf14ec28fdba91aa1da8b338acb0190f43aa78" exitCode=0
Mar 21 04:03:13 crc kubenswrapper[4685]: I0321 04:03:13.308426 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqlv8" event={"ID":"8730ef7e-146f-470d-acc9-fec7740ca406","Type":"ContainerDied","Data":"62fee6f6b1c6432834ce3153bcbf14ec28fdba91aa1da8b338acb0190f43aa78"}
Mar 21 04:03:14 crc kubenswrapper[4685]: I0321 04:03:14.191404 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gqlv8"
Mar 21 04:03:14 crc kubenswrapper[4685]: I0321 04:03:14.316533 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqlv8" event={"ID":"8730ef7e-146f-470d-acc9-fec7740ca406","Type":"ContainerDied","Data":"f75fbc15b9c5a70c8d1e18aa11ab5597ca4f64c8d1d21de497ed493b43791909"}
Mar 21 04:03:14 crc kubenswrapper[4685]: I0321 04:03:14.316582 4685 scope.go:117] "RemoveContainer" containerID="62fee6f6b1c6432834ce3153bcbf14ec28fdba91aa1da8b338acb0190f43aa78"
Need to start a new one" pod="openshift-marketplace/certified-operators-gqlv8" Mar 21 04:03:14 crc kubenswrapper[4685]: I0321 04:03:14.330403 4685 scope.go:117] "RemoveContainer" containerID="61881e05c19453fa5da44b3b1ce5291f40c374c212d7e903860e5ec7272f0818" Mar 21 04:03:14 crc kubenswrapper[4685]: I0321 04:03:14.347892 4685 scope.go:117] "RemoveContainer" containerID="a725919639018c1cc2e9e97f8c5a3907ca685209652fa9ed3a4961717e046f8c" Mar 21 04:03:14 crc kubenswrapper[4685]: I0321 04:03:14.379381 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8730ef7e-146f-470d-acc9-fec7740ca406-utilities\") pod \"8730ef7e-146f-470d-acc9-fec7740ca406\" (UID: \"8730ef7e-146f-470d-acc9-fec7740ca406\") " Mar 21 04:03:14 crc kubenswrapper[4685]: I0321 04:03:14.379740 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-498wr\" (UniqueName: \"kubernetes.io/projected/8730ef7e-146f-470d-acc9-fec7740ca406-kube-api-access-498wr\") pod \"8730ef7e-146f-470d-acc9-fec7740ca406\" (UID: \"8730ef7e-146f-470d-acc9-fec7740ca406\") " Mar 21 04:03:14 crc kubenswrapper[4685]: I0321 04:03:14.379796 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8730ef7e-146f-470d-acc9-fec7740ca406-catalog-content\") pod \"8730ef7e-146f-470d-acc9-fec7740ca406\" (UID: \"8730ef7e-146f-470d-acc9-fec7740ca406\") " Mar 21 04:03:14 crc kubenswrapper[4685]: I0321 04:03:14.380587 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8730ef7e-146f-470d-acc9-fec7740ca406-utilities" (OuterVolumeSpecName: "utilities") pod "8730ef7e-146f-470d-acc9-fec7740ca406" (UID: "8730ef7e-146f-470d-acc9-fec7740ca406"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:03:14 crc kubenswrapper[4685]: I0321 04:03:14.387431 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8730ef7e-146f-470d-acc9-fec7740ca406-kube-api-access-498wr" (OuterVolumeSpecName: "kube-api-access-498wr") pod "8730ef7e-146f-470d-acc9-fec7740ca406" (UID: "8730ef7e-146f-470d-acc9-fec7740ca406"). InnerVolumeSpecName "kube-api-access-498wr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:03:14 crc kubenswrapper[4685]: I0321 04:03:14.429614 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8730ef7e-146f-470d-acc9-fec7740ca406-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8730ef7e-146f-470d-acc9-fec7740ca406" (UID: "8730ef7e-146f-470d-acc9-fec7740ca406"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:03:14 crc kubenswrapper[4685]: I0321 04:03:14.481127 4685 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8730ef7e-146f-470d-acc9-fec7740ca406-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:03:14 crc kubenswrapper[4685]: I0321 04:03:14.481159 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-498wr\" (UniqueName: \"kubernetes.io/projected/8730ef7e-146f-470d-acc9-fec7740ca406-kube-api-access-498wr\") on node \"crc\" DevicePath \"\"" Mar 21 04:03:14 crc kubenswrapper[4685]: I0321 04:03:14.481173 4685 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8730ef7e-146f-470d-acc9-fec7740ca406-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:03:14 crc kubenswrapper[4685]: I0321 04:03:14.659777 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gqlv8"] Mar 21 04:03:14 crc kubenswrapper[4685]: I0321 04:03:14.664081 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gqlv8"] Mar 21 04:03:15 crc kubenswrapper[4685]: I0321 04:03:15.325148 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-tf44w" event={"ID":"5397a9a1-b670-42a8-8515-8cf15e8aa2d4","Type":"ContainerStarted","Data":"93ff44a4581786ab19bb098a9133e5ff804e505fdec4b90e75879f470a42909c"} Mar 21 04:03:15 crc kubenswrapper[4685]: I0321 04:03:15.348058 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-tf44w" podStartSLOduration=2.120523377 podStartE2EDuration="6.348031101s" podCreationTimestamp="2026-03-21 04:03:09 +0000 UTC" firstStartedPulling="2026-03-21 04:03:09.963204535 +0000 UTC m=+1022.440273327" lastFinishedPulling="2026-03-21 04:03:14.190712259 +0000 UTC m=+1026.667781051" observedRunningTime="2026-03-21 04:03:15.342166792 +0000 UTC m=+1027.819235604" watchObservedRunningTime="2026-03-21 04:03:15.348031101 +0000 UTC m=+1027.825099923" Mar 21 04:03:16 crc kubenswrapper[4685]: I0321 04:03:16.309718 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8730ef7e-146f-470d-acc9-fec7740ca406" path="/var/lib/kubelet/pods/8730ef7e-146f-470d-acc9-fec7740ca406/volumes" Mar 21 04:03:18 crc kubenswrapper[4685]: I0321 04:03:18.762544 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/rabbitmq-server-0"] Mar 21 04:03:18 crc kubenswrapper[4685]: E0321 04:03:18.763247 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8730ef7e-146f-470d-acc9-fec7740ca406" containerName="extract-content" Mar 21 04:03:18 crc kubenswrapper[4685]: I0321 04:03:18.763270 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="8730ef7e-146f-470d-acc9-fec7740ca406" containerName="extract-content" Mar 21 04:03:18 crc kubenswrapper[4685]: E0321 04:03:18.763295 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8730ef7e-146f-470d-acc9-fec7740ca406" containerName="registry-server" Mar 21 04:03:18 crc kubenswrapper[4685]: I0321 04:03:18.763306 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="8730ef7e-146f-470d-acc9-fec7740ca406" containerName="registry-server" Mar 21 04:03:18 crc kubenswrapper[4685]: E0321 04:03:18.763333 4685 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8730ef7e-146f-470d-acc9-fec7740ca406" containerName="extract-utilities" Mar 21 04:03:18 crc kubenswrapper[4685]: I0321 04:03:18.763343 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="8730ef7e-146f-470d-acc9-fec7740ca406" containerName="extract-utilities" Mar 21 04:03:18 crc kubenswrapper[4685]: I0321 04:03:18.763520 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="8730ef7e-146f-470d-acc9-fec7740ca406" containerName="registry-server" Mar 21 04:03:18 crc kubenswrapper[4685]: I0321 04:03:18.764328 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/rabbitmq-server-0" Mar 21 04:03:18 crc kubenswrapper[4685]: I0321 04:03:18.769485 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"barbican-kuttl-tests"/"rabbitmq-plugins-conf" Mar 21 04:03:18 crc kubenswrapper[4685]: I0321 04:03:18.769790 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"barbican-kuttl-tests"/"rabbitmq-server-conf" Mar 21 04:03:18 crc kubenswrapper[4685]: I0321 04:03:18.769874 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"rabbitmq-server-dockercfg-9pk9k" Mar 21 04:03:18 crc kubenswrapper[4685]: I0321 04:03:18.770816 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"rabbitmq-default-user" Mar 21 04:03:18 crc kubenswrapper[4685]: I0321 04:03:18.771394 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"rabbitmq-erlang-cookie" Mar 21 04:03:18 crc kubenswrapper[4685]: I0321 04:03:18.783924 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/rabbitmq-server-0"] Mar 21 04:03:18 crc kubenswrapper[4685]: I0321 04:03:18.936922 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgdkb\" (UniqueName: \"kubernetes.io/projected/d18693b0-ea2a-4795-a4de-15a379cc8490-kube-api-access-qgdkb\") pod \"rabbitmq-server-0\" (UID: \"d18693b0-ea2a-4795-a4de-15a379cc8490\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Mar 21 04:03:18 crc kubenswrapper[4685]: I0321 04:03:18.936989 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d18693b0-ea2a-4795-a4de-15a379cc8490-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d18693b0-ea2a-4795-a4de-15a379cc8490\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Mar 21 04:03:18 crc kubenswrapper[4685]: I0321 04:03:18.937094 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d18693b0-ea2a-4795-a4de-15a379cc8490-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d18693b0-ea2a-4795-a4de-15a379cc8490\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Mar 21 04:03:18 crc kubenswrapper[4685]: I0321 04:03:18.937162 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d18693b0-ea2a-4795-a4de-15a379cc8490-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d18693b0-ea2a-4795-a4de-15a379cc8490\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Mar 21 04:03:18 crc kubenswrapper[4685]: I0321 04:03:18.937202 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/d18693b0-ea2a-4795-a4de-15a379cc8490-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d18693b0-ea2a-4795-a4de-15a379cc8490\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Mar 21 04:03:18 crc kubenswrapper[4685]: I0321 04:03:18.937256 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d18693b0-ea2a-4795-a4de-15a379cc8490-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d18693b0-ea2a-4795-a4de-15a379cc8490\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Mar 21 04:03:18 crc kubenswrapper[4685]: I0321 04:03:18.937283 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d18693b0-ea2a-4795-a4de-15a379cc8490-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d18693b0-ea2a-4795-a4de-15a379cc8490\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Mar 21 04:03:18 crc kubenswrapper[4685]: I0321 04:03:18.937375 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-cd7b19e4-2e04-41ca-a42e-dec51f7bbe94\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd7b19e4-2e04-41ca-a42e-dec51f7bbe94\") pod \"rabbitmq-server-0\" (UID: \"d18693b0-ea2a-4795-a4de-15a379cc8490\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Mar 21 04:03:19 crc kubenswrapper[4685]: I0321 04:03:19.039047 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d18693b0-ea2a-4795-a4de-15a379cc8490-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d18693b0-ea2a-4795-a4de-15a379cc8490\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Mar 21 04:03:19 crc kubenswrapper[4685]: I0321 04:03:19.039092 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d18693b0-ea2a-4795-a4de-15a379cc8490-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d18693b0-ea2a-4795-a4de-15a379cc8490\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Mar 21 04:03:19 crc kubenswrapper[4685]: I0321 04:03:19.039149 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-cd7b19e4-2e04-41ca-a42e-dec51f7bbe94\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd7b19e4-2e04-41ca-a42e-dec51f7bbe94\") pod \"rabbitmq-server-0\" (UID: \"d18693b0-ea2a-4795-a4de-15a379cc8490\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Mar 21 04:03:19 crc kubenswrapper[4685]: I0321 04:03:19.039615 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d18693b0-ea2a-4795-a4de-15a379cc8490-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d18693b0-ea2a-4795-a4de-15a379cc8490\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Mar 21 04:03:19 crc kubenswrapper[4685]: I0321 04:03:19.039683 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgdkb\" (UniqueName: \"kubernetes.io/projected/d18693b0-ea2a-4795-a4de-15a379cc8490-kube-api-access-qgdkb\") pod \"rabbitmq-server-0\" (UID: \"d18693b0-ea2a-4795-a4de-15a379cc8490\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Mar 21 04:03:19 crc kubenswrapper[4685]: I0321 04:03:19.042972 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/d18693b0-ea2a-4795-a4de-15a379cc8490-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d18693b0-ea2a-4795-a4de-15a379cc8490\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Mar 21 04:03:19 crc kubenswrapper[4685]: I0321 04:03:19.043066 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d18693b0-ea2a-4795-a4de-15a379cc8490-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d18693b0-ea2a-4795-a4de-15a379cc8490\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Mar 21 04:03:19 crc kubenswrapper[4685]: I0321 04:03:19.043101 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d18693b0-ea2a-4795-a4de-15a379cc8490-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d18693b0-ea2a-4795-a4de-15a379cc8490\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Mar 21 04:03:19 crc kubenswrapper[4685]: I0321 04:03:19.043305 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d18693b0-ea2a-4795-a4de-15a379cc8490-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d18693b0-ea2a-4795-a4de-15a379cc8490\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Mar 21 04:03:19 crc kubenswrapper[4685]: I0321 04:03:19.043635 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d18693b0-ea2a-4795-a4de-15a379cc8490-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d18693b0-ea2a-4795-a4de-15a379cc8490\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Mar 21 04:03:19 crc kubenswrapper[4685]: I0321 04:03:19.044655 4685 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 21 04:03:19 crc kubenswrapper[4685]: I0321 04:03:19.044687 4685 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-cd7b19e4-2e04-41ca-a42e-dec51f7bbe94\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd7b19e4-2e04-41ca-a42e-dec51f7bbe94\") pod \"rabbitmq-server-0\" (UID: \"d18693b0-ea2a-4795-a4de-15a379cc8490\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/84b25019069577b99273f352cf1df17de29cd407e25e840f5fcff3a1aa2bbb61/globalmount\"" pod="barbican-kuttl-tests/rabbitmq-server-0"
Mar 21 04:03:19 crc kubenswrapper[4685]: I0321 04:03:19.045891 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d18693b0-ea2a-4795-a4de-15a379cc8490-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d18693b0-ea2a-4795-a4de-15a379cc8490\") " pod="barbican-kuttl-tests/rabbitmq-server-0"
Mar 21 04:03:19 crc kubenswrapper[4685]: I0321 04:03:19.052985 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d18693b0-ea2a-4795-a4de-15a379cc8490-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d18693b0-ea2a-4795-a4de-15a379cc8490\") " pod="barbican-kuttl-tests/rabbitmq-server-0"
Mar 21 04:03:19 crc kubenswrapper[4685]: I0321 04:03:19.053400 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d18693b0-ea2a-4795-a4de-15a379cc8490-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d18693b0-ea2a-4795-a4de-15a379cc8490\") " pod="barbican-kuttl-tests/rabbitmq-server-0"
Mar 21 04:03:19 crc kubenswrapper[4685]: I0321 04:03:19.063494 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgdkb\" (UniqueName: \"kubernetes.io/projected/d18693b0-ea2a-4795-a4de-15a379cc8490-kube-api-access-qgdkb\") pod \"rabbitmq-server-0\" (UID: \"d18693b0-ea2a-4795-a4de-15a379cc8490\") " pod="barbican-kuttl-tests/rabbitmq-server-0"
Mar 21 04:03:19 crc kubenswrapper[4685]: I0321 04:03:19.070754 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d18693b0-ea2a-4795-a4de-15a379cc8490-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d18693b0-ea2a-4795-a4de-15a379cc8490\") " pod="barbican-kuttl-tests/rabbitmq-server-0"
Mar 21 04:03:19 crc kubenswrapper[4685]: I0321 04:03:19.092202 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-cd7b19e4-2e04-41ca-a42e-dec51f7bbe94\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd7b19e4-2e04-41ca-a42e-dec51f7bbe94\") pod \"rabbitmq-server-0\" (UID: \"d18693b0-ea2a-4795-a4de-15a379cc8490\") " pod="barbican-kuttl-tests/rabbitmq-server-0"
Need to start a new one" pod="barbican-kuttl-tests/rabbitmq-server-0" Mar 21 04:03:19 crc kubenswrapper[4685]: I0321 04:03:19.402567 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jdpdp" Mar 21 04:03:19 crc kubenswrapper[4685]: I0321 04:03:19.440967 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jdpdp" Mar 21 04:03:19 crc kubenswrapper[4685]: I0321 04:03:19.822882 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/rabbitmq-server-0"] Mar 21 04:03:20 crc kubenswrapper[4685]: I0321 04:03:20.358742 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/rabbitmq-server-0" event={"ID":"d18693b0-ea2a-4795-a4de-15a379cc8490","Type":"ContainerStarted","Data":"2f5496e15c04ecc25ede871af6a6220aaba433c9942d07d8afbf0f7ad00c2d45"} Mar 21 04:03:21 crc kubenswrapper[4685]: I0321 04:03:21.446475 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-h4hc9"] Mar 21 04:03:21 crc kubenswrapper[4685]: I0321 04:03:21.447315 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-h4hc9" Mar 21 04:03:21 crc kubenswrapper[4685]: I0321 04:03:21.450327 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-index-dockercfg-6dtrz" Mar 21 04:03:21 crc kubenswrapper[4685]: I0321 04:03:21.454623 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-h4hc9"] Mar 21 04:03:21 crc kubenswrapper[4685]: I0321 04:03:21.578939 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgm47\" (UniqueName: \"kubernetes.io/projected/e92e8d0a-5e30-4648-b7b5-9b6040db75f0-kube-api-access-rgm47\") pod \"keystone-operator-index-h4hc9\" (UID: \"e92e8d0a-5e30-4648-b7b5-9b6040db75f0\") " pod="openstack-operators/keystone-operator-index-h4hc9" Mar 21 04:03:21 crc kubenswrapper[4685]: I0321 04:03:21.680463 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgm47\" (UniqueName: \"kubernetes.io/projected/e92e8d0a-5e30-4648-b7b5-9b6040db75f0-kube-api-access-rgm47\") pod \"keystone-operator-index-h4hc9\" (UID: \"e92e8d0a-5e30-4648-b7b5-9b6040db75f0\") " pod="openstack-operators/keystone-operator-index-h4hc9" Mar 21 04:03:21 crc kubenswrapper[4685]: I0321 04:03:21.699243 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgm47\" (UniqueName: \"kubernetes.io/projected/e92e8d0a-5e30-4648-b7b5-9b6040db75f0-kube-api-access-rgm47\") pod \"keystone-operator-index-h4hc9\" (UID: \"e92e8d0a-5e30-4648-b7b5-9b6040db75f0\") " pod="openstack-operators/keystone-operator-index-h4hc9" Mar 21 04:03:21 crc kubenswrapper[4685]: I0321 04:03:21.770532 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-h4hc9" Mar 21 04:03:22 crc kubenswrapper[4685]: I0321 04:03:22.219300 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-h4hc9"] Mar 21 04:03:23 crc kubenswrapper[4685]: W0321 04:03:23.941818 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode92e8d0a_5e30_4648_b7b5_9b6040db75f0.slice/crio-4fa08531c0227dea6527b5d8c58c0efa2feb05317ccc9cd694eb6d7c9918d176 WatchSource:0}: Error finding container 4fa08531c0227dea6527b5d8c58c0efa2feb05317ccc9cd694eb6d7c9918d176: Status 404 returned error can't find the container with id 4fa08531c0227dea6527b5d8c58c0efa2feb05317ccc9cd694eb6d7c9918d176 Mar 21 04:03:24 crc kubenswrapper[4685]: I0321 04:03:24.393465 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-h4hc9" event={"ID":"e92e8d0a-5e30-4648-b7b5-9b6040db75f0","Type":"ContainerStarted","Data":"4fa08531c0227dea6527b5d8c58c0efa2feb05317ccc9cd694eb6d7c9918d176"} Mar 21 04:03:26 crc kubenswrapper[4685]: I0321 04:03:26.236017 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jdpdp"] Mar 21 04:03:26 crc kubenswrapper[4685]: I0321 04:03:26.236626 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jdpdp" podUID="2809b575-0d1a-4860-9b47-097651eb2dd1" containerName="registry-server" containerID="cri-o://f54453561fc6d4cb8ff0d381b7280d051ecf949d7272fee52fccdc3c37f0e5f8" gracePeriod=2 Mar 21 04:03:26 crc kubenswrapper[4685]: I0321 04:03:26.408739 4685 generic.go:334] "Generic (PLEG): container finished" podID="2809b575-0d1a-4860-9b47-097651eb2dd1" containerID="f54453561fc6d4cb8ff0d381b7280d051ecf949d7272fee52fccdc3c37f0e5f8" exitCode=0 Mar 21 04:03:26 crc kubenswrapper[4685]: I0321 04:03:26.408814 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jdpdp" event={"ID":"2809b575-0d1a-4860-9b47-097651eb2dd1","Type":"ContainerDied","Data":"f54453561fc6d4cb8ff0d381b7280d051ecf949d7272fee52fccdc3c37f0e5f8"} Mar 21 04:03:26 crc kubenswrapper[4685]: I0321 04:03:26.714666 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jdpdp" Mar 21 04:03:26 crc kubenswrapper[4685]: I0321 04:03:26.864431 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2809b575-0d1a-4860-9b47-097651eb2dd1-catalog-content\") pod \"2809b575-0d1a-4860-9b47-097651eb2dd1\" (UID: \"2809b575-0d1a-4860-9b47-097651eb2dd1\") " Mar 21 04:03:26 crc kubenswrapper[4685]: I0321 04:03:26.864536 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2809b575-0d1a-4860-9b47-097651eb2dd1-utilities\") pod \"2809b575-0d1a-4860-9b47-097651eb2dd1\" (UID: \"2809b575-0d1a-4860-9b47-097651eb2dd1\") " Mar 21 04:03:26 crc kubenswrapper[4685]: I0321 04:03:26.864560 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl5l7\" (UniqueName: \"kubernetes.io/projected/2809b575-0d1a-4860-9b47-097651eb2dd1-kube-api-access-cl5l7\") pod \"2809b575-0d1a-4860-9b47-097651eb2dd1\" (UID: \"2809b575-0d1a-4860-9b47-097651eb2dd1\") " Mar 21 04:03:26 crc kubenswrapper[4685]: I0321 04:03:26.866250 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2809b575-0d1a-4860-9b47-097651eb2dd1-utilities" (OuterVolumeSpecName: "utilities") pod "2809b575-0d1a-4860-9b47-097651eb2dd1" (UID: "2809b575-0d1a-4860-9b47-097651eb2dd1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:03:26 crc kubenswrapper[4685]: I0321 04:03:26.871948 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2809b575-0d1a-4860-9b47-097651eb2dd1-kube-api-access-cl5l7" (OuterVolumeSpecName: "kube-api-access-cl5l7") pod "2809b575-0d1a-4860-9b47-097651eb2dd1" (UID: "2809b575-0d1a-4860-9b47-097651eb2dd1"). InnerVolumeSpecName "kube-api-access-cl5l7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:03:26 crc kubenswrapper[4685]: I0321 04:03:26.965937 4685 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2809b575-0d1a-4860-9b47-097651eb2dd1-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:03:26 crc kubenswrapper[4685]: I0321 04:03:26.965975 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cl5l7\" (UniqueName: \"kubernetes.io/projected/2809b575-0d1a-4860-9b47-097651eb2dd1-kube-api-access-cl5l7\") on node \"crc\" DevicePath \"\"" Mar 21 04:03:26 crc kubenswrapper[4685]: I0321 04:03:26.988856 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2809b575-0d1a-4860-9b47-097651eb2dd1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2809b575-0d1a-4860-9b47-097651eb2dd1" (UID: "2809b575-0d1a-4860-9b47-097651eb2dd1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:03:27 crc kubenswrapper[4685]: I0321 04:03:27.067446 4685 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2809b575-0d1a-4860-9b47-097651eb2dd1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:03:27 crc kubenswrapper[4685]: I0321 04:03:27.417194 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-h4hc9" event={"ID":"e92e8d0a-5e30-4648-b7b5-9b6040db75f0","Type":"ContainerStarted","Data":"bd6b7473b6bccdaa548f8ab30e82f3ea62e21f216d4fe508e8351b54c92bc0cb"} Mar 21 04:03:27 crc kubenswrapper[4685]: I0321 04:03:27.419356 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jdpdp" event={"ID":"2809b575-0d1a-4860-9b47-097651eb2dd1","Type":"ContainerDied","Data":"2f13645cfd9c4e6b4fec7716103ea05139eaf1e1e2fb420b2a22f9c52d535175"} Mar 21 04:03:27 crc kubenswrapper[4685]: I0321 04:03:27.419405 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jdpdp" Mar 21 04:03:27 crc kubenswrapper[4685]: I0321 04:03:27.419469 4685 scope.go:117] "RemoveContainer" containerID="f54453561fc6d4cb8ff0d381b7280d051ecf949d7272fee52fccdc3c37f0e5f8" Mar 21 04:03:27 crc kubenswrapper[4685]: I0321 04:03:27.433794 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-h4hc9" podStartSLOduration=3.822133619 podStartE2EDuration="6.433771515s" podCreationTimestamp="2026-03-21 04:03:21 +0000 UTC" firstStartedPulling="2026-03-21 04:03:23.944825415 +0000 UTC m=+1036.421894207" lastFinishedPulling="2026-03-21 04:03:26.556463311 +0000 UTC m=+1039.033532103" observedRunningTime="2026-03-21 04:03:27.431254402 +0000 UTC m=+1039.908323204" watchObservedRunningTime="2026-03-21 04:03:27.433771515 +0000 UTC m=+1039.910840307" Mar 21 04:03:27 crc kubenswrapper[4685]: I0321 04:03:27.453010 4685 scope.go:117] "RemoveContainer" containerID="b38971fe97b98a35f068b168051c47239fcab1cb731b9d255034b18ba2623b6e" Mar 21 04:03:27 crc kubenswrapper[4685]: I0321 04:03:27.463456 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jdpdp"] Mar 21 04:03:27 crc kubenswrapper[4685]: I0321 04:03:27.475246 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jdpdp"] Mar 21 04:03:27 crc kubenswrapper[4685]: I0321 04:03:27.492472 4685 scope.go:117] "RemoveContainer" containerID="5900adee8afe52b940f822a33957d6bca94d8fe117bc706f33beb88f781b4c36" Mar 21 04:03:28 crc kubenswrapper[4685]: I0321 04:03:28.308045 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2809b575-0d1a-4860-9b47-097651eb2dd1" path="/var/lib/kubelet/pods/2809b575-0d1a-4860-9b47-097651eb2dd1/volumes" Mar 21 04:03:28 crc kubenswrapper[4685]: I0321 04:03:28.429062 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/rabbitmq-server-0" event={"ID":"d18693b0-ea2a-4795-a4de-15a379cc8490","Type":"ContainerStarted","Data":"0b5a6c171308ac60c554fab568debabcf1c313fb15eededa7eb38e4bb375eb1b"} Mar 21 04:03:31 crc kubenswrapper[4685]: I0321 04:03:31.772413 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/keystone-operator-index-h4hc9" Mar 21 04:03:31 crc kubenswrapper[4685]: I0321 04:03:31.772792 4685 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/keystone-operator-index-h4hc9" Mar 21 04:03:31 crc kubenswrapper[4685]: I0321 04:03:31.818920 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/keystone-operator-index-h4hc9" Mar 21 04:03:32 crc kubenswrapper[4685]: I0321 04:03:32.478169 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-index-h4hc9" Mar 21 04:03:34 crc kubenswrapper[4685]: I0321 04:03:34.884160 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/25b5da83b847401861089ca4a9b7591d95e0d659639006dce400c423edngstq"] Mar 21 04:03:34 crc kubenswrapper[4685]: E0321 04:03:34.884385 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2809b575-0d1a-4860-9b47-097651eb2dd1" containerName="registry-server" Mar 21 04:03:34 crc kubenswrapper[4685]: I0321 04:03:34.884396 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="2809b575-0d1a-4860-9b47-097651eb2dd1" containerName="registry-server" Mar 21 04:03:34 crc kubenswrapper[4685]: E0321 04:03:34.884414 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2809b575-0d1a-4860-9b47-097651eb2dd1" containerName="extract-utilities" Mar 21 04:03:34 crc kubenswrapper[4685]: I0321 04:03:34.884419 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="2809b575-0d1a-4860-9b47-097651eb2dd1" containerName="extract-utilities" Mar 21 04:03:34 crc kubenswrapper[4685]: E0321 04:03:34.884428 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2809b575-0d1a-4860-9b47-097651eb2dd1" containerName="extract-content" Mar 21 04:03:34 crc kubenswrapper[4685]: I0321 04:03:34.884434 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="2809b575-0d1a-4860-9b47-097651eb2dd1" containerName="extract-content" Mar 21 04:03:34 crc kubenswrapper[4685]: I0321 04:03:34.884529 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="2809b575-0d1a-4860-9b47-097651eb2dd1" containerName="registry-server" Mar 21 04:03:34 crc kubenswrapper[4685]: I0321 04:03:34.885290 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/25b5da83b847401861089ca4a9b7591d95e0d659639006dce400c423edngstq" Mar 21 04:03:34 crc kubenswrapper[4685]: I0321 04:03:34.888703 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-2vwsz" Mar 21 04:03:34 crc kubenswrapper[4685]: I0321 04:03:34.923554 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/25b5da83b847401861089ca4a9b7591d95e0d659639006dce400c423edngstq"] Mar 21 04:03:34 crc kubenswrapper[4685]: I0321 04:03:34.972107 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/261fe7a7-2f57-4959-b23a-0752118908c9-util\") pod \"25b5da83b847401861089ca4a9b7591d95e0d659639006dce400c423edngstq\" (UID: \"261fe7a7-2f57-4959-b23a-0752118908c9\") " pod="openstack-operators/25b5da83b847401861089ca4a9b7591d95e0d659639006dce400c423edngstq" Mar 21 04:03:34 crc kubenswrapper[4685]: I0321 04:03:34.972439 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/261fe7a7-2f57-4959-b23a-0752118908c9-bundle\") pod \"25b5da83b847401861089ca4a9b7591d95e0d659639006dce400c423edngstq\" (UID: \"261fe7a7-2f57-4959-b23a-0752118908c9\") " pod="openstack-operators/25b5da83b847401861089ca4a9b7591d95e0d659639006dce400c423edngstq" Mar 21 04:03:34 crc kubenswrapper[4685]: I0321 04:03:34.972473 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t75wq\" (UniqueName: \"kubernetes.io/projected/261fe7a7-2f57-4959-b23a-0752118908c9-kube-api-access-t75wq\") pod \"25b5da83b847401861089ca4a9b7591d95e0d659639006dce400c423edngstq\" (UID: \"261fe7a7-2f57-4959-b23a-0752118908c9\") " pod="openstack-operators/25b5da83b847401861089ca4a9b7591d95e0d659639006dce400c423edngstq" Mar 21 04:03:35 crc kubenswrapper[4685]: I0321 04:03:35.073831 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/261fe7a7-2f57-4959-b23a-0752118908c9-bundle\") pod \"25b5da83b847401861089ca4a9b7591d95e0d659639006dce400c423edngstq\" (UID: \"261fe7a7-2f57-4959-b23a-0752118908c9\") " pod="openstack-operators/25b5da83b847401861089ca4a9b7591d95e0d659639006dce400c423edngstq" Mar 21 04:03:35 crc kubenswrapper[4685]: I0321 04:03:35.074089 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t75wq\" (UniqueName: \"kubernetes.io/projected/261fe7a7-2f57-4959-b23a-0752118908c9-kube-api-access-t75wq\") pod \"25b5da83b847401861089ca4a9b7591d95e0d659639006dce400c423edngstq\" (UID: \"261fe7a7-2f57-4959-b23a-0752118908c9\") " pod="openstack-operators/25b5da83b847401861089ca4a9b7591d95e0d659639006dce400c423edngstq" Mar 21 04:03:35 crc kubenswrapper[4685]: I0321 04:03:35.074215 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/261fe7a7-2f57-4959-b23a-0752118908c9-util\") pod \"25b5da83b847401861089ca4a9b7591d95e0d659639006dce400c423edngstq\" (UID: \"261fe7a7-2f57-4959-b23a-0752118908c9\") " pod="openstack-operators/25b5da83b847401861089ca4a9b7591d95e0d659639006dce400c423edngstq" Mar 21 04:03:35 crc kubenswrapper[4685]: I0321 04:03:35.074426 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/261fe7a7-2f57-4959-b23a-0752118908c9-bundle\") pod \"25b5da83b847401861089ca4a9b7591d95e0d659639006dce400c423edngstq\" (UID: \"261fe7a7-2f57-4959-b23a-0752118908c9\") " pod="openstack-operators/25b5da83b847401861089ca4a9b7591d95e0d659639006dce400c423edngstq" Mar 21 04:03:35 crc kubenswrapper[4685]: I0321 04:03:35.074617 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/261fe7a7-2f57-4959-b23a-0752118908c9-util\") pod \"25b5da83b847401861089ca4a9b7591d95e0d659639006dce400c423edngstq\" (UID: \"261fe7a7-2f57-4959-b23a-0752118908c9\") " pod="openstack-operators/25b5da83b847401861089ca4a9b7591d95e0d659639006dce400c423edngstq" Mar 21 04:03:35 crc kubenswrapper[4685]: I0321 04:03:35.092714 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t75wq\" (UniqueName: \"kubernetes.io/projected/261fe7a7-2f57-4959-b23a-0752118908c9-kube-api-access-t75wq\") pod \"25b5da83b847401861089ca4a9b7591d95e0d659639006dce400c423edngstq\" (UID: \"261fe7a7-2f57-4959-b23a-0752118908c9\") " pod="openstack-operators/25b5da83b847401861089ca4a9b7591d95e0d659639006dce400c423edngstq" Mar 21 04:03:35 crc kubenswrapper[4685]: I0321 04:03:35.202493 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/25b5da83b847401861089ca4a9b7591d95e0d659639006dce400c423edngstq" Mar 21 04:03:35 crc kubenswrapper[4685]: I0321 04:03:35.646936 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/25b5da83b847401861089ca4a9b7591d95e0d659639006dce400c423edngstq"] Mar 21 04:03:35 crc kubenswrapper[4685]: W0321 04:03:35.661940 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod261fe7a7_2f57_4959_b23a_0752118908c9.slice/crio-4900ef2f8b1bcbc225cf9be9793f6860f1c6f3b9725013ea9f76613a37baa6f0 WatchSource:0}: Error finding container 4900ef2f8b1bcbc225cf9be9793f6860f1c6f3b9725013ea9f76613a37baa6f0: Status 404 returned error can't find the container with id 4900ef2f8b1bcbc225cf9be9793f6860f1c6f3b9725013ea9f76613a37baa6f0 Mar 21 04:03:36 crc kubenswrapper[4685]: I0321 04:03:36.479110 4685 generic.go:334] "Generic (PLEG): container finished" podID="261fe7a7-2f57-4959-b23a-0752118908c9" containerID="62f4399fd4930f3ab0ae55d2fcc04d98910e30189611688a706f8f63a35fbae3" exitCode=0 Mar 21 04:03:36 crc kubenswrapper[4685]: I0321 04:03:36.479178 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/25b5da83b847401861089ca4a9b7591d95e0d659639006dce400c423edngstq" event={"ID":"261fe7a7-2f57-4959-b23a-0752118908c9","Type":"ContainerDied","Data":"62f4399fd4930f3ab0ae55d2fcc04d98910e30189611688a706f8f63a35fbae3"} Mar 21 04:03:36 crc kubenswrapper[4685]: I0321 04:03:36.479505 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/25b5da83b847401861089ca4a9b7591d95e0d659639006dce400c423edngstq" event={"ID":"261fe7a7-2f57-4959-b23a-0752118908c9","Type":"ContainerStarted","Data":"4900ef2f8b1bcbc225cf9be9793f6860f1c6f3b9725013ea9f76613a37baa6f0"} Mar 21 04:03:37 crc kubenswrapper[4685]: I0321 04:03:37.512161 4685 generic.go:334] "Generic (PLEG): container finished" podID="261fe7a7-2f57-4959-b23a-0752118908c9" containerID="e8000909b4890418d0431be2bda32cbf14a20b8de4e545501e17052be394ac67" exitCode=0 Mar 21 04:03:37 crc kubenswrapper[4685]: I0321 04:03:37.512258 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/25b5da83b847401861089ca4a9b7591d95e0d659639006dce400c423edngstq" event={"ID":"261fe7a7-2f57-4959-b23a-0752118908c9","Type":"ContainerDied","Data":"e8000909b4890418d0431be2bda32cbf14a20b8de4e545501e17052be394ac67"} Mar 21 04:03:38 crc kubenswrapper[4685]: I0321 04:03:38.521307 4685 generic.go:334] "Generic (PLEG): container finished" podID="261fe7a7-2f57-4959-b23a-0752118908c9" containerID="58e207a688cfbd25650183c8c86f302428d2c16db293af38dfde5313bac735c2" exitCode=0 Mar 21 04:03:38 crc kubenswrapper[4685]: I0321 04:03:38.521380 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/25b5da83b847401861089ca4a9b7591d95e0d659639006dce400c423edngstq" event={"ID":"261fe7a7-2f57-4959-b23a-0752118908c9","Type":"ContainerDied","Data":"58e207a688cfbd25650183c8c86f302428d2c16db293af38dfde5313bac735c2"} Mar 21 04:03:40 crc kubenswrapper[4685]: I0321 04:03:40.052013 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/25b5da83b847401861089ca4a9b7591d95e0d659639006dce400c423edngstq" Mar 21 04:03:40 crc kubenswrapper[4685]: I0321 04:03:40.139564 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t75wq\" (UniqueName: \"kubernetes.io/projected/261fe7a7-2f57-4959-b23a-0752118908c9-kube-api-access-t75wq\") pod \"261fe7a7-2f57-4959-b23a-0752118908c9\" (UID: \"261fe7a7-2f57-4959-b23a-0752118908c9\") " Mar 21 04:03:40 crc kubenswrapper[4685]: I0321 04:03:40.139718 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/261fe7a7-2f57-4959-b23a-0752118908c9-util\") pod \"261fe7a7-2f57-4959-b23a-0752118908c9\" (UID: \"261fe7a7-2f57-4959-b23a-0752118908c9\") " Mar 21 04:03:40 crc kubenswrapper[4685]: I0321 04:03:40.139758 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/261fe7a7-2f57-4959-b23a-0752118908c9-bundle\") pod \"261fe7a7-2f57-4959-b23a-0752118908c9\" (UID: \"261fe7a7-2f57-4959-b23a-0752118908c9\") " Mar 21 04:03:40 crc kubenswrapper[4685]: I0321 04:03:40.141131 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/261fe7a7-2f57-4959-b23a-0752118908c9-bundle" (OuterVolumeSpecName: "bundle") pod "261fe7a7-2f57-4959-b23a-0752118908c9" (UID: "261fe7a7-2f57-4959-b23a-0752118908c9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:03:40 crc kubenswrapper[4685]: I0321 04:03:40.145087 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/261fe7a7-2f57-4959-b23a-0752118908c9-kube-api-access-t75wq" (OuterVolumeSpecName: "kube-api-access-t75wq") pod "261fe7a7-2f57-4959-b23a-0752118908c9" (UID: "261fe7a7-2f57-4959-b23a-0752118908c9"). InnerVolumeSpecName "kube-api-access-t75wq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:03:40 crc kubenswrapper[4685]: I0321 04:03:40.157828 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/261fe7a7-2f57-4959-b23a-0752118908c9-util" (OuterVolumeSpecName: "util") pod "261fe7a7-2f57-4959-b23a-0752118908c9" (UID: "261fe7a7-2f57-4959-b23a-0752118908c9"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:03:40 crc kubenswrapper[4685]: I0321 04:03:40.241757 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t75wq\" (UniqueName: \"kubernetes.io/projected/261fe7a7-2f57-4959-b23a-0752118908c9-kube-api-access-t75wq\") on node \"crc\" DevicePath \"\"" Mar 21 04:03:40 crc kubenswrapper[4685]: I0321 04:03:40.241795 4685 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/261fe7a7-2f57-4959-b23a-0752118908c9-util\") on node \"crc\" DevicePath \"\"" Mar 21 04:03:40 crc kubenswrapper[4685]: I0321 04:03:40.241807 4685 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/261fe7a7-2f57-4959-b23a-0752118908c9-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:03:40 crc kubenswrapper[4685]: I0321 04:03:40.547562 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/25b5da83b847401861089ca4a9b7591d95e0d659639006dce400c423edngstq" event={"ID":"261fe7a7-2f57-4959-b23a-0752118908c9","Type":"ContainerDied","Data":"4900ef2f8b1bcbc225cf9be9793f6860f1c6f3b9725013ea9f76613a37baa6f0"} Mar 21 04:03:40 crc kubenswrapper[4685]: I0321 04:03:40.547596 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/25b5da83b847401861089ca4a9b7591d95e0d659639006dce400c423edngstq" Mar 21 04:03:40 crc kubenswrapper[4685]: I0321 04:03:40.547622 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4900ef2f8b1bcbc225cf9be9793f6860f1c6f3b9725013ea9f76613a37baa6f0" Mar 21 04:03:45 crc kubenswrapper[4685]: I0321 04:03:45.653644 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bcnrq"] Mar 21 04:03:45 crc kubenswrapper[4685]: E0321 04:03:45.654616 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="261fe7a7-2f57-4959-b23a-0752118908c9" containerName="pull" Mar 21 04:03:45 crc kubenswrapper[4685]: I0321 04:03:45.654634 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="261fe7a7-2f57-4959-b23a-0752118908c9" containerName="pull" Mar 21 04:03:45 crc kubenswrapper[4685]: E0321 04:03:45.654647 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="261fe7a7-2f57-4959-b23a-0752118908c9" containerName="util" Mar 21 04:03:45 crc kubenswrapper[4685]: I0321 04:03:45.654655 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="261fe7a7-2f57-4959-b23a-0752118908c9" containerName="util" Mar 21 04:03:45 crc kubenswrapper[4685]: E0321 04:03:45.654671 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="261fe7a7-2f57-4959-b23a-0752118908c9" containerName="extract" Mar 21 04:03:45 crc kubenswrapper[4685]: I0321 04:03:45.654680 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="261fe7a7-2f57-4959-b23a-0752118908c9" containerName="extract" Mar 21 04:03:45 crc kubenswrapper[4685]: I0321 04:03:45.654823 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="261fe7a7-2f57-4959-b23a-0752118908c9" containerName="extract" Mar 21 04:03:45 crc kubenswrapper[4685]: I0321 04:03:45.656000 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bcnrq" Mar 21 04:03:45 crc kubenswrapper[4685]: I0321 04:03:45.666718 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bcnrq"] Mar 21 04:03:45 crc kubenswrapper[4685]: I0321 04:03:45.824049 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fee79fb-651e-42fb-a60a-55319c4666a3-utilities\") pod \"community-operators-bcnrq\" (UID: \"1fee79fb-651e-42fb-a60a-55319c4666a3\") " pod="openshift-marketplace/community-operators-bcnrq" Mar 21 04:03:45 crc kubenswrapper[4685]: I0321 04:03:45.824104 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fee79fb-651e-42fb-a60a-55319c4666a3-catalog-content\") pod \"community-operators-bcnrq\" (UID: \"1fee79fb-651e-42fb-a60a-55319c4666a3\") " pod="openshift-marketplace/community-operators-bcnrq" Mar 21 04:03:45 crc kubenswrapper[4685]: I0321 04:03:45.824273 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hztkl\" (UniqueName: \"kubernetes.io/projected/1fee79fb-651e-42fb-a60a-55319c4666a3-kube-api-access-hztkl\") pod \"community-operators-bcnrq\" (UID: \"1fee79fb-651e-42fb-a60a-55319c4666a3\") " pod="openshift-marketplace/community-operators-bcnrq" Mar 21 04:03:45 crc kubenswrapper[4685]: I0321 04:03:45.926113 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hztkl\" (UniqueName: \"kubernetes.io/projected/1fee79fb-651e-42fb-a60a-55319c4666a3-kube-api-access-hztkl\") pod \"community-operators-bcnrq\" (UID: \"1fee79fb-651e-42fb-a60a-55319c4666a3\") " pod="openshift-marketplace/community-operators-bcnrq" Mar 21 04:03:45 crc kubenswrapper[4685]: I0321 04:03:45.926545 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fee79fb-651e-42fb-a60a-55319c4666a3-utilities\") pod \"community-operators-bcnrq\" (UID: \"1fee79fb-651e-42fb-a60a-55319c4666a3\") " pod="openshift-marketplace/community-operators-bcnrq" Mar 21 04:03:45 crc kubenswrapper[4685]: I0321 04:03:45.926608 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fee79fb-651e-42fb-a60a-55319c4666a3-catalog-content\") pod \"community-operators-bcnrq\" (UID: \"1fee79fb-651e-42fb-a60a-55319c4666a3\") " pod="openshift-marketplace/community-operators-bcnrq" Mar 21 04:03:45 crc kubenswrapper[4685]: I0321 04:03:45.927417 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fee79fb-651e-42fb-a60a-55319c4666a3-utilities\") pod \"community-operators-bcnrq\" (UID: \"1fee79fb-651e-42fb-a60a-55319c4666a3\") " pod="openshift-marketplace/community-operators-bcnrq" Mar 21 04:03:45 crc kubenswrapper[4685]: I0321 04:03:45.927505 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fee79fb-651e-42fb-a60a-55319c4666a3-catalog-content\") pod \"community-operators-bcnrq\" (UID: \"1fee79fb-651e-42fb-a60a-55319c4666a3\") " pod="openshift-marketplace/community-operators-bcnrq" Mar 21 04:03:45 crc kubenswrapper[4685]: I0321 04:03:45.948940 4685 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hztkl\" (UniqueName: \"kubernetes.io/projected/1fee79fb-651e-42fb-a60a-55319c4666a3-kube-api-access-hztkl\") pod \"community-operators-bcnrq\" (UID: \"1fee79fb-651e-42fb-a60a-55319c4666a3\") " pod="openshift-marketplace/community-operators-bcnrq" Mar 21 04:03:45 crc kubenswrapper[4685]: I0321 04:03:45.980200 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bcnrq" Mar 21 04:03:46 crc kubenswrapper[4685]: I0321 04:03:46.395039 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bcnrq"] Mar 21 04:03:46 crc kubenswrapper[4685]: I0321 04:03:46.587340 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcnrq" event={"ID":"1fee79fb-651e-42fb-a60a-55319c4666a3","Type":"ContainerStarted","Data":"5adb9b7cedf954468dece28a57c84279664499cc5b3731032809778606da7df3"} Mar 21 04:03:47 crc kubenswrapper[4685]: I0321 04:03:47.595363 4685 generic.go:334] "Generic (PLEG): container finished" podID="1fee79fb-651e-42fb-a60a-55319c4666a3" containerID="5f9b13843ce7cfe60bbca680a9a3b2a760a9118156fc8bc8be58f8433440e0f6" exitCode=0 Mar 21 04:03:47 crc kubenswrapper[4685]: I0321 04:03:47.595459 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcnrq" event={"ID":"1fee79fb-651e-42fb-a60a-55319c4666a3","Type":"ContainerDied","Data":"5f9b13843ce7cfe60bbca680a9a3b2a760a9118156fc8bc8be58f8433440e0f6"} Mar 21 04:03:49 crc kubenswrapper[4685]: I0321 04:03:49.614305 4685 generic.go:334] "Generic (PLEG): container finished" podID="1fee79fb-651e-42fb-a60a-55319c4666a3" containerID="a5e78ba39e157c3f256ba210957b89e74ba910d6b58f14241d5fa9e214f88bd9" exitCode=0 Mar 21 04:03:49 crc kubenswrapper[4685]: I0321 04:03:49.614411 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcnrq" event={"ID":"1fee79fb-651e-42fb-a60a-55319c4666a3","Type":"ContainerDied","Data":"a5e78ba39e157c3f256ba210957b89e74ba910d6b58f14241d5fa9e214f88bd9"} Mar 21 04:03:50 crc kubenswrapper[4685]: I0321 04:03:50.622237 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcnrq" event={"ID":"1fee79fb-651e-42fb-a60a-55319c4666a3","Type":"ContainerStarted","Data":"089f0026a8c686205a2ed8e0b9efec14c6e581acad02ab00af33a41dc56d895b"} Mar 21 04:03:50 crc kubenswrapper[4685]: I0321 04:03:50.642235 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bcnrq" podStartSLOduration=2.853720006 podStartE2EDuration="5.64221592s" podCreationTimestamp="2026-03-21 04:03:45 +0000 UTC" firstStartedPulling="2026-03-21 04:03:47.598660728 +0000 UTC m=+1060.075729520" lastFinishedPulling="2026-03-21 04:03:50.387156642 +0000 UTC m=+1062.864225434" observedRunningTime="2026-03-21 04:03:50.639125261 +0000 UTC m=+1063.116194063" watchObservedRunningTime="2026-03-21 04:03:50.64221592 +0000 UTC m=+1063.119284712" Mar 21 04:03:52 crc kubenswrapper[4685]: I0321 04:03:52.222032 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5c456858cb-jnxfq"] Mar 21 04:03:52 crc kubenswrapper[4685]: I0321 04:03:52.224048 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5c456858cb-jnxfq" Mar 21 04:03:52 crc kubenswrapper[4685]: I0321 04:03:52.226096 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-ddcsd" Mar 21 04:03:52 crc kubenswrapper[4685]: I0321 04:03:52.226908 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-service-cert" Mar 21 04:03:52 crc kubenswrapper[4685]: I0321 04:03:52.239963 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5c456858cb-jnxfq"] Mar 21 04:03:52 crc kubenswrapper[4685]: I0321 04:03:52.312916 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7e33f8cc-f6fa-48ab-a172-74892478c268-webhook-cert\") pod \"keystone-operator-controller-manager-5c456858cb-jnxfq\" (UID: \"7e33f8cc-f6fa-48ab-a172-74892478c268\") " pod="openstack-operators/keystone-operator-controller-manager-5c456858cb-jnxfq" Mar 21 04:03:52 crc kubenswrapper[4685]: I0321 04:03:52.313092 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7e33f8cc-f6fa-48ab-a172-74892478c268-apiservice-cert\") pod \"keystone-operator-controller-manager-5c456858cb-jnxfq\" (UID: \"7e33f8cc-f6fa-48ab-a172-74892478c268\") " pod="openstack-operators/keystone-operator-controller-manager-5c456858cb-jnxfq" Mar 21 04:03:52 crc kubenswrapper[4685]: I0321 04:03:52.313134 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4692\" (UniqueName: \"kubernetes.io/projected/7e33f8cc-f6fa-48ab-a172-74892478c268-kube-api-access-q4692\") pod \"keystone-operator-controller-manager-5c456858cb-jnxfq\" (UID: \"7e33f8cc-f6fa-48ab-a172-74892478c268\") " pod="openstack-operators/keystone-operator-controller-manager-5c456858cb-jnxfq" Mar 21 04:03:52 crc kubenswrapper[4685]: I0321 04:03:52.416919 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7e33f8cc-f6fa-48ab-a172-74892478c268-apiservice-cert\") pod \"keystone-operator-controller-manager-5c456858cb-jnxfq\" (UID: \"7e33f8cc-f6fa-48ab-a172-74892478c268\") " pod="openstack-operators/keystone-operator-controller-manager-5c456858cb-jnxfq" Mar 21 04:03:52 crc kubenswrapper[4685]: I0321 04:03:52.416980 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4692\" (UniqueName: \"kubernetes.io/projected/7e33f8cc-f6fa-48ab-a172-74892478c268-kube-api-access-q4692\") pod \"keystone-operator-controller-manager-5c456858cb-jnxfq\" (UID: \"7e33f8cc-f6fa-48ab-a172-74892478c268\") " pod="openstack-operators/keystone-operator-controller-manager-5c456858cb-jnxfq" Mar 21 04:03:52 crc kubenswrapper[4685]: I0321 04:03:52.417025 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7e33f8cc-f6fa-48ab-a172-74892478c268-webhook-cert\") pod \"keystone-operator-controller-manager-5c456858cb-jnxfq\" (UID: \"7e33f8cc-f6fa-48ab-a172-74892478c268\") " pod="openstack-operators/keystone-operator-controller-manager-5c456858cb-jnxfq" Mar 21 04:03:52 crc kubenswrapper[4685]: I0321 04:03:52.424170 4685 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7e33f8cc-f6fa-48ab-a172-74892478c268-apiservice-cert\") pod \"keystone-operator-controller-manager-5c456858cb-jnxfq\" (UID: \"7e33f8cc-f6fa-48ab-a172-74892478c268\") " pod="openstack-operators/keystone-operator-controller-manager-5c456858cb-jnxfq" Mar 21 04:03:52 crc kubenswrapper[4685]: I0321 04:03:52.436826 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7e33f8cc-f6fa-48ab-a172-74892478c268-webhook-cert\") pod \"keystone-operator-controller-manager-5c456858cb-jnxfq\" (UID: \"7e33f8cc-f6fa-48ab-a172-74892478c268\") " pod="openstack-operators/keystone-operator-controller-manager-5c456858cb-jnxfq" Mar 21 04:03:52 crc kubenswrapper[4685]: I0321 04:03:52.440552 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4692\" (UniqueName: \"kubernetes.io/projected/7e33f8cc-f6fa-48ab-a172-74892478c268-kube-api-access-q4692\") pod \"keystone-operator-controller-manager-5c456858cb-jnxfq\" (UID: \"7e33f8cc-f6fa-48ab-a172-74892478c268\") " pod="openstack-operators/keystone-operator-controller-manager-5c456858cb-jnxfq" Mar 21 04:03:52 crc kubenswrapper[4685]: I0321 04:03:52.542495 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5c456858cb-jnxfq" Mar 21 04:03:52 crc kubenswrapper[4685]: I0321 04:03:52.951434 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5c456858cb-jnxfq"] Mar 21 04:03:52 crc kubenswrapper[4685]: W0321 04:03:52.952824 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e33f8cc_f6fa_48ab_a172_74892478c268.slice/crio-7aae4e911c8c43ac4d1f9e8baa7e39dfa7328d06acda028d8d2ce5a273801709 WatchSource:0}: Error finding container 7aae4e911c8c43ac4d1f9e8baa7e39dfa7328d06acda028d8d2ce5a273801709: Status 404 returned error can't find the container with id 7aae4e911c8c43ac4d1f9e8baa7e39dfa7328d06acda028d8d2ce5a273801709 Mar 21 04:03:53 crc kubenswrapper[4685]: I0321 04:03:53.647788 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5c456858cb-jnxfq" event={"ID":"7e33f8cc-f6fa-48ab-a172-74892478c268","Type":"ContainerStarted","Data":"7aae4e911c8c43ac4d1f9e8baa7e39dfa7328d06acda028d8d2ce5a273801709"} Mar 21 04:03:55 crc kubenswrapper[4685]: I0321 04:03:55.981136 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bcnrq" Mar 21 04:03:55 crc kubenswrapper[4685]: I0321 04:03:55.981508 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bcnrq" Mar 21 04:03:56 crc kubenswrapper[4685]: I0321 04:03:56.022625 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bcnrq" Mar 21 04:03:56 crc kubenswrapper[4685]: I0321 04:03:56.699732 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bcnrq" Mar 21 04:03:59 crc kubenswrapper[4685]: I0321 04:03:59.639447 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bcnrq"] Mar 21 04:03:59 crc kubenswrapper[4685]: I0321 
04:03:59.640140 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bcnrq" podUID="1fee79fb-651e-42fb-a60a-55319c4666a3" containerName="registry-server" containerID="cri-o://089f0026a8c686205a2ed8e0b9efec14c6e581acad02ab00af33a41dc56d895b" gracePeriod=2 Mar 21 04:03:59 crc kubenswrapper[4685]: I0321 04:03:59.703959 4685 generic.go:334] "Generic (PLEG): container finished" podID="d18693b0-ea2a-4795-a4de-15a379cc8490" containerID="0b5a6c171308ac60c554fab568debabcf1c313fb15eededa7eb38e4bb375eb1b" exitCode=0 Mar 21 04:03:59 crc kubenswrapper[4685]: I0321 04:03:59.704014 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/rabbitmq-server-0" event={"ID":"d18693b0-ea2a-4795-a4de-15a379cc8490","Type":"ContainerDied","Data":"0b5a6c171308ac60c554fab568debabcf1c313fb15eededa7eb38e4bb375eb1b"} Mar 21 04:04:00 crc kubenswrapper[4685]: I0321 04:04:00.130922 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567764-pqzcc"] Mar 21 04:04:00 crc kubenswrapper[4685]: I0321 04:04:00.132442 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567764-pqzcc" Mar 21 04:04:00 crc kubenswrapper[4685]: I0321 04:04:00.136544 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 04:04:00 crc kubenswrapper[4685]: I0321 04:04:00.136702 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 04:04:00 crc kubenswrapper[4685]: I0321 04:04:00.136711 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k75cc" Mar 21 04:04:00 crc kubenswrapper[4685]: I0321 04:04:00.139201 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567764-pqzcc"] Mar 21 04:04:00 crc kubenswrapper[4685]: I0321 04:04:00.257021 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz2kx\" (UniqueName: \"kubernetes.io/projected/0f957d4c-d9e0-4c92-ac81-bd5bdab751ad-kube-api-access-lz2kx\") pod \"auto-csr-approver-29567764-pqzcc\" (UID: \"0f957d4c-d9e0-4c92-ac81-bd5bdab751ad\") " pod="openshift-infra/auto-csr-approver-29567764-pqzcc" Mar 21 04:04:00 crc kubenswrapper[4685]: I0321 04:04:00.358754 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz2kx\" (UniqueName: \"kubernetes.io/projected/0f957d4c-d9e0-4c92-ac81-bd5bdab751ad-kube-api-access-lz2kx\") pod \"auto-csr-approver-29567764-pqzcc\" (UID: \"0f957d4c-d9e0-4c92-ac81-bd5bdab751ad\") " pod="openshift-infra/auto-csr-approver-29567764-pqzcc" Mar 21 04:04:00 crc kubenswrapper[4685]: I0321 04:04:00.389992 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz2kx\" (UniqueName: \"kubernetes.io/projected/0f957d4c-d9e0-4c92-ac81-bd5bdab751ad-kube-api-access-lz2kx\") pod \"auto-csr-approver-29567764-pqzcc\" (UID: \"0f957d4c-d9e0-4c92-ac81-bd5bdab751ad\") " pod="openshift-infra/auto-csr-approver-29567764-pqzcc" Mar 21 04:04:00 crc kubenswrapper[4685]: I0321 04:04:00.448987 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567764-pqzcc" Mar 21 04:04:01 crc kubenswrapper[4685]: I0321 04:04:01.726055 4685 generic.go:334] "Generic (PLEG): container finished" podID="1fee79fb-651e-42fb-a60a-55319c4666a3" containerID="089f0026a8c686205a2ed8e0b9efec14c6e581acad02ab00af33a41dc56d895b" exitCode=0 Mar 21 04:04:01 crc kubenswrapper[4685]: I0321 04:04:01.726108 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcnrq" event={"ID":"1fee79fb-651e-42fb-a60a-55319c4666a3","Type":"ContainerDied","Data":"089f0026a8c686205a2ed8e0b9efec14c6e581acad02ab00af33a41dc56d895b"} Mar 21 04:04:05 crc kubenswrapper[4685]: I0321 04:04:05.755015 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcnrq" event={"ID":"1fee79fb-651e-42fb-a60a-55319c4666a3","Type":"ContainerDied","Data":"5adb9b7cedf954468dece28a57c84279664499cc5b3731032809778606da7df3"} Mar 21 04:04:05 crc kubenswrapper[4685]: I0321 04:04:05.755737 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5adb9b7cedf954468dece28a57c84279664499cc5b3731032809778606da7df3" Mar 21 04:04:05 crc kubenswrapper[4685]: I0321 04:04:05.790554 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bcnrq" Mar 21 04:04:05 crc kubenswrapper[4685]: I0321 04:04:05.936100 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fee79fb-651e-42fb-a60a-55319c4666a3-catalog-content\") pod \"1fee79fb-651e-42fb-a60a-55319c4666a3\" (UID: \"1fee79fb-651e-42fb-a60a-55319c4666a3\") " Mar 21 04:04:05 crc kubenswrapper[4685]: I0321 04:04:05.936445 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hztkl\" (UniqueName: \"kubernetes.io/projected/1fee79fb-651e-42fb-a60a-55319c4666a3-kube-api-access-hztkl\") pod \"1fee79fb-651e-42fb-a60a-55319c4666a3\" (UID: \"1fee79fb-651e-42fb-a60a-55319c4666a3\") " Mar 21 04:04:05 crc kubenswrapper[4685]: I0321 04:04:05.936562 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fee79fb-651e-42fb-a60a-55319c4666a3-utilities\") pod \"1fee79fb-651e-42fb-a60a-55319c4666a3\" (UID: \"1fee79fb-651e-42fb-a60a-55319c4666a3\") " Mar 21 04:04:05 crc kubenswrapper[4685]: I0321 04:04:05.937359 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fee79fb-651e-42fb-a60a-55319c4666a3-utilities" (OuterVolumeSpecName: "utilities") pod "1fee79fb-651e-42fb-a60a-55319c4666a3" (UID: "1fee79fb-651e-42fb-a60a-55319c4666a3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:04:05 crc kubenswrapper[4685]: I0321 04:04:05.946153 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fee79fb-651e-42fb-a60a-55319c4666a3-kube-api-access-hztkl" (OuterVolumeSpecName: "kube-api-access-hztkl") pod "1fee79fb-651e-42fb-a60a-55319c4666a3" (UID: "1fee79fb-651e-42fb-a60a-55319c4666a3"). InnerVolumeSpecName "kube-api-access-hztkl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:04:05 crc kubenswrapper[4685]: I0321 04:04:05.993663 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fee79fb-651e-42fb-a60a-55319c4666a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1fee79fb-651e-42fb-a60a-55319c4666a3" (UID: "1fee79fb-651e-42fb-a60a-55319c4666a3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:04:06 crc kubenswrapper[4685]: I0321 04:04:06.038224 4685 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fee79fb-651e-42fb-a60a-55319c4666a3-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:04:06 crc kubenswrapper[4685]: I0321 04:04:06.038263 4685 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fee79fb-651e-42fb-a60a-55319c4666a3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:04:06 crc kubenswrapper[4685]: I0321 04:04:06.038276 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hztkl\" (UniqueName: \"kubernetes.io/projected/1fee79fb-651e-42fb-a60a-55319c4666a3-kube-api-access-hztkl\") on node \"crc\" DevicePath \"\"" Mar 21 04:04:06 crc kubenswrapper[4685]: I0321 04:04:06.130381 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567764-pqzcc"] Mar 21 04:04:06 crc kubenswrapper[4685]: W0321 04:04:06.144384 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f957d4c_d9e0_4c92_ac81_bd5bdab751ad.slice/crio-b04c1007d482c7ae3413c6603bf8f029b84d1fd9d09045cbb49022c8048b922f WatchSource:0}: Error finding container b04c1007d482c7ae3413c6603bf8f029b84d1fd9d09045cbb49022c8048b922f: Status 404 returned error can't find the container with id b04c1007d482c7ae3413c6603bf8f029b84d1fd9d09045cbb49022c8048b922f Mar 21 04:04:06 crc kubenswrapper[4685]: I0321 04:04:06.762446 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5c456858cb-jnxfq" event={"ID":"7e33f8cc-f6fa-48ab-a172-74892478c268","Type":"ContainerStarted","Data":"e2d92cc4f8795aed53ebcd983b89eb996fa6974a422c65d6956148de1c17f3c7"} Mar 21 04:04:06 crc kubenswrapper[4685]: I0321 04:04:06.762763 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-5c456858cb-jnxfq" Mar 21 04:04:06 crc kubenswrapper[4685]: I0321 04:04:06.764569 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/rabbitmq-server-0" event={"ID":"d18693b0-ea2a-4795-a4de-15a379cc8490","Type":"ContainerStarted","Data":"d744b99d0f599149431f3be36e7e73a2b55609a1d321ca7ef292c0e1275ec425"} Mar 21 04:04:06 crc kubenswrapper[4685]: I0321 04:04:06.764799 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/rabbitmq-server-0" Mar 21 04:04:06 crc kubenswrapper[4685]: I0321 04:04:06.765767 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567764-pqzcc" event={"ID":"0f957d4c-d9e0-4c92-ac81-bd5bdab751ad","Type":"ContainerStarted","Data":"b04c1007d482c7ae3413c6603bf8f029b84d1fd9d09045cbb49022c8048b922f"} Mar 21 04:04:06 crc kubenswrapper[4685]: I0321 04:04:06.765811 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bcnrq" Mar 21 04:04:06 crc kubenswrapper[4685]: I0321 04:04:06.803330 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/rabbitmq-server-0" podStartSLOduration=43.214528957 podStartE2EDuration="49.803311024s" podCreationTimestamp="2026-03-21 04:03:17 +0000 UTC" firstStartedPulling="2026-03-21 04:03:19.834944789 +0000 UTC m=+1032.312013591" lastFinishedPulling="2026-03-21 04:03:26.423726856 +0000 UTC m=+1038.900795658" observedRunningTime="2026-03-21 04:04:06.801354628 +0000 UTC m=+1079.278423420" watchObservedRunningTime="2026-03-21 04:04:06.803311024 +0000 UTC m=+1079.280379816" Mar 21 04:04:06 crc kubenswrapper[4685]: I0321 04:04:06.806873 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-5c456858cb-jnxfq" podStartSLOduration=1.7990215840000001 podStartE2EDuration="14.806861307s" podCreationTimestamp="2026-03-21 04:03:52 +0000 UTC" firstStartedPulling="2026-03-21 04:03:52.955177887 +0000 UTC m=+1065.432246679" lastFinishedPulling="2026-03-21 04:04:05.96301761 +0000 UTC m=+1078.440086402" observedRunningTime="2026-03-21 04:04:06.784519321 +0000 UTC m=+1079.261588123" watchObservedRunningTime="2026-03-21 04:04:06.806861307 +0000 UTC m=+1079.283930099" Mar 21 04:04:06 crc kubenswrapper[4685]: I0321 04:04:06.816496 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bcnrq"] Mar 21 04:04:06 crc kubenswrapper[4685]: I0321 04:04:06.820678 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bcnrq"] Mar 21 04:04:07 crc kubenswrapper[4685]: I0321 04:04:07.773222 4685 generic.go:334] "Generic (PLEG): container finished" podID="0f957d4c-d9e0-4c92-ac81-bd5bdab751ad" containerID="ece686729f1617c9e2677d81696e23494e78fee60fdf82dc773918bae0d3fbf9" exitCode=0 Mar 21 04:04:07 crc kubenswrapper[4685]: I0321 04:04:07.773288 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567764-pqzcc" event={"ID":"0f957d4c-d9e0-4c92-ac81-bd5bdab751ad","Type":"ContainerDied","Data":"ece686729f1617c9e2677d81696e23494e78fee60fdf82dc773918bae0d3fbf9"} Mar 21 04:04:08 crc kubenswrapper[4685]: I0321 04:04:08.310759 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fee79fb-651e-42fb-a60a-55319c4666a3" path="/var/lib/kubelet/pods/1fee79fb-651e-42fb-a60a-55319c4666a3/volumes" Mar 21 04:04:09 crc kubenswrapper[4685]: I0321 04:04:09.044074 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567764-pqzcc" Mar 21 04:04:09 crc kubenswrapper[4685]: I0321 04:04:09.179505 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz2kx\" (UniqueName: \"kubernetes.io/projected/0f957d4c-d9e0-4c92-ac81-bd5bdab751ad-kube-api-access-lz2kx\") pod \"0f957d4c-d9e0-4c92-ac81-bd5bdab751ad\" (UID: \"0f957d4c-d9e0-4c92-ac81-bd5bdab751ad\") " Mar 21 04:04:09 crc kubenswrapper[4685]: I0321 04:04:09.188588 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f957d4c-d9e0-4c92-ac81-bd5bdab751ad-kube-api-access-lz2kx" (OuterVolumeSpecName: "kube-api-access-lz2kx") pod "0f957d4c-d9e0-4c92-ac81-bd5bdab751ad" (UID: "0f957d4c-d9e0-4c92-ac81-bd5bdab751ad"). InnerVolumeSpecName "kube-api-access-lz2kx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:04:09 crc kubenswrapper[4685]: I0321 04:04:09.281253 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz2kx\" (UniqueName: \"kubernetes.io/projected/0f957d4c-d9e0-4c92-ac81-bd5bdab751ad-kube-api-access-lz2kx\") on node \"crc\" DevicePath \"\"" Mar 21 04:04:09 crc kubenswrapper[4685]: I0321 04:04:09.788362 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567764-pqzcc" event={"ID":"0f957d4c-d9e0-4c92-ac81-bd5bdab751ad","Type":"ContainerDied","Data":"b04c1007d482c7ae3413c6603bf8f029b84d1fd9d09045cbb49022c8048b922f"} Mar 21 04:04:09 crc kubenswrapper[4685]: I0321 04:04:09.788399 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b04c1007d482c7ae3413c6603bf8f029b84d1fd9d09045cbb49022c8048b922f" Mar 21 04:04:09 crc kubenswrapper[4685]: I0321 04:04:09.788406 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567764-pqzcc" Mar 21 04:04:10 crc kubenswrapper[4685]: I0321 04:04:10.103913 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567758-77xb4"] Mar 21 04:04:10 crc kubenswrapper[4685]: I0321 04:04:10.112897 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567758-77xb4"] Mar 21 04:04:10 crc kubenswrapper[4685]: I0321 04:04:10.312379 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c94a3bd5-4817-40c9-8adc-d3bdf8e42180" path="/var/lib/kubelet/pods/c94a3bd5-4817-40c9-8adc-d3bdf8e42180/volumes" Mar 21 04:04:10 crc kubenswrapper[4685]: I0321 04:04:10.645196 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d88th"] Mar 21 04:04:10 crc kubenswrapper[4685]: E0321 04:04:10.645491 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f957d4c-d9e0-4c92-ac81-bd5bdab751ad" containerName="oc" Mar 21 04:04:10 crc kubenswrapper[4685]: I0321 04:04:10.645506 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f957d4c-d9e0-4c92-ac81-bd5bdab751ad" containerName="oc" Mar 21 04:04:10 crc kubenswrapper[4685]: E0321 04:04:10.645526 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fee79fb-651e-42fb-a60a-55319c4666a3" containerName="extract-utilities" Mar 21 04:04:10 crc kubenswrapper[4685]: I0321 04:04:10.645536 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fee79fb-651e-42fb-a60a-55319c4666a3" containerName="extract-utilities" Mar 21 04:04:10 crc kubenswrapper[4685]: E0321 04:04:10.645563 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fee79fb-651e-42fb-a60a-55319c4666a3" containerName="extract-content" Mar 21 04:04:10 crc kubenswrapper[4685]: I0321 04:04:10.645573 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fee79fb-651e-42fb-a60a-55319c4666a3" containerName="extract-content" Mar 21 04:04:10 crc kubenswrapper[4685]: E0321 04:04:10.645590 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fee79fb-651e-42fb-a60a-55319c4666a3" containerName="registry-server" Mar 21 04:04:10 crc kubenswrapper[4685]: I0321 04:04:10.645598 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fee79fb-651e-42fb-a60a-55319c4666a3" containerName="registry-server" Mar 21 04:04:10 crc kubenswrapper[4685]: I0321 04:04:10.645738 4685 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1fee79fb-651e-42fb-a60a-55319c4666a3" containerName="registry-server" Mar 21 04:04:10 crc kubenswrapper[4685]: I0321 04:04:10.645753 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f957d4c-d9e0-4c92-ac81-bd5bdab751ad" containerName="oc" Mar 21 04:04:10 crc kubenswrapper[4685]: I0321 04:04:10.647004 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d88th" Mar 21 04:04:10 crc kubenswrapper[4685]: I0321 04:04:10.654770 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d88th"] Mar 21 04:04:10 crc kubenswrapper[4685]: I0321 04:04:10.800110 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgwd2\" (UniqueName: \"kubernetes.io/projected/0352254e-d6b6-42ed-8c15-d44ce48e19a9-kube-api-access-mgwd2\") pod \"redhat-marketplace-d88th\" (UID: \"0352254e-d6b6-42ed-8c15-d44ce48e19a9\") " pod="openshift-marketplace/redhat-marketplace-d88th" Mar 21 04:04:10 crc kubenswrapper[4685]: I0321 04:04:10.800173 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0352254e-d6b6-42ed-8c15-d44ce48e19a9-utilities\") pod \"redhat-marketplace-d88th\" (UID: \"0352254e-d6b6-42ed-8c15-d44ce48e19a9\") " pod="openshift-marketplace/redhat-marketplace-d88th" Mar 21 04:04:10 crc kubenswrapper[4685]: I0321 04:04:10.800206 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0352254e-d6b6-42ed-8c15-d44ce48e19a9-catalog-content\") pod \"redhat-marketplace-d88th\" (UID: \"0352254e-d6b6-42ed-8c15-d44ce48e19a9\") " pod="openshift-marketplace/redhat-marketplace-d88th" Mar 21 04:04:10 crc kubenswrapper[4685]: I0321 04:04:10.902124 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgwd2\" (UniqueName: \"kubernetes.io/projected/0352254e-d6b6-42ed-8c15-d44ce48e19a9-kube-api-access-mgwd2\") pod \"redhat-marketplace-d88th\" (UID: \"0352254e-d6b6-42ed-8c15-d44ce48e19a9\") " pod="openshift-marketplace/redhat-marketplace-d88th" Mar 21 04:04:10 crc kubenswrapper[4685]: I0321 04:04:10.902182 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0352254e-d6b6-42ed-8c15-d44ce48e19a9-utilities\") pod \"redhat-marketplace-d88th\" (UID: \"0352254e-d6b6-42ed-8c15-d44ce48e19a9\") " pod="openshift-marketplace/redhat-marketplace-d88th" Mar 21 04:04:10 crc kubenswrapper[4685]: I0321 04:04:10.902219 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0352254e-d6b6-42ed-8c15-d44ce48e19a9-catalog-content\") pod \"redhat-marketplace-d88th\" (UID: \"0352254e-d6b6-42ed-8c15-d44ce48e19a9\") " pod="openshift-marketplace/redhat-marketplace-d88th" Mar 21 04:04:10 crc kubenswrapper[4685]: I0321 04:04:10.902630 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0352254e-d6b6-42ed-8c15-d44ce48e19a9-catalog-content\") pod \"redhat-marketplace-d88th\" (UID: \"0352254e-d6b6-42ed-8c15-d44ce48e19a9\") " pod="openshift-marketplace/redhat-marketplace-d88th" Mar 21 04:04:10 crc kubenswrapper[4685]: I0321 04:04:10.902765 4685 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0352254e-d6b6-42ed-8c15-d44ce48e19a9-utilities\") pod \"redhat-marketplace-d88th\" (UID: \"0352254e-d6b6-42ed-8c15-d44ce48e19a9\") " pod="openshift-marketplace/redhat-marketplace-d88th" Mar 21 04:04:10 crc kubenswrapper[4685]: I0321 04:04:10.919708 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgwd2\" (UniqueName: \"kubernetes.io/projected/0352254e-d6b6-42ed-8c15-d44ce48e19a9-kube-api-access-mgwd2\") pod \"redhat-marketplace-d88th\" (UID: \"0352254e-d6b6-42ed-8c15-d44ce48e19a9\") " pod="openshift-marketplace/redhat-marketplace-d88th" Mar 21 04:04:10 crc kubenswrapper[4685]: I0321 04:04:10.965109 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d88th" Mar 21 04:04:11 crc kubenswrapper[4685]: I0321 04:04:11.349762 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d88th"] Mar 21 04:04:11 crc kubenswrapper[4685]: I0321 04:04:11.799890 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d88th" event={"ID":"0352254e-d6b6-42ed-8c15-d44ce48e19a9","Type":"ContainerStarted","Data":"d39d03cc095cb09398df2beb6f71650f5c2bf4eed1f9a3b7e70ab98b8052409d"} Mar 21 04:04:13 crc kubenswrapper[4685]: I0321 04:04:13.119456 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-5c456858cb-jnxfq" Mar 21 04:04:13 crc kubenswrapper[4685]: E0321 04:04:13.352866 4685 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0352254e_d6b6_42ed_8c15_d44ce48e19a9.slice/crio-8ed1033c651dbe52868ae5e7a6467de0ccb242004d2c15f62b7ff199942c876a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0352254e_d6b6_42ed_8c15_d44ce48e19a9.slice/crio-conmon-8ed1033c651dbe52868ae5e7a6467de0ccb242004d2c15f62b7ff199942c876a.scope\": RecentStats: unable to find data in memory cache]" Mar 21 04:04:14 crc kubenswrapper[4685]: I0321 04:04:14.125186 4685 generic.go:334] "Generic (PLEG): container finished" podID="0352254e-d6b6-42ed-8c15-d44ce48e19a9" containerID="8ed1033c651dbe52868ae5e7a6467de0ccb242004d2c15f62b7ff199942c876a" exitCode=0 Mar 21 04:04:14 crc kubenswrapper[4685]: I0321 04:04:14.125227 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d88th" event={"ID":"0352254e-d6b6-42ed-8c15-d44ce48e19a9","Type":"ContainerDied","Data":"8ed1033c651dbe52868ae5e7a6467de0ccb242004d2c15f62b7ff199942c876a"} Mar 21 04:04:16 crc kubenswrapper[4685]: I0321 04:04:16.141784 4685 generic.go:334] "Generic (PLEG): container finished" podID="0352254e-d6b6-42ed-8c15-d44ce48e19a9" containerID="2d083082abe6921d0bd84cb0137ce2760d0a038e76bfbd8648175db07428b2fe" exitCode=0 Mar 21 04:04:16 crc kubenswrapper[4685]: I0321 04:04:16.142111 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d88th" event={"ID":"0352254e-d6b6-42ed-8c15-d44ce48e19a9","Type":"ContainerDied","Data":"2d083082abe6921d0bd84cb0137ce2760d0a038e76bfbd8648175db07428b2fe"} Mar 21 04:04:17 crc kubenswrapper[4685]: I0321 04:04:17.150819 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-d88th" event={"ID":"0352254e-d6b6-42ed-8c15-d44ce48e19a9","Type":"ContainerStarted","Data":"c37b5e87bd75595227b0130281ba74e7d714f94fffec0a69ac1ce4a9a214c1fc"} Mar 21 04:04:17 crc kubenswrapper[4685]: I0321 04:04:17.167894 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d88th" podStartSLOduration=4.438391407 podStartE2EDuration="7.167854635s" podCreationTimestamp="2026-03-21 04:04:10 +0000 UTC" firstStartedPulling="2026-03-21 04:04:14.127166176 +0000 UTC m=+1086.604234968" lastFinishedPulling="2026-03-21 04:04:16.856629404 +0000 UTC m=+1089.333698196" observedRunningTime="2026-03-21 04:04:17.166107644 +0000 UTC m=+1089.643176436" watchObservedRunningTime="2026-03-21 04:04:17.167854635 +0000 UTC m=+1089.644945417" Mar 21 04:04:18 crc kubenswrapper[4685]: I0321 04:04:18.272555 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/keystone-e186-account-create-update-b7zr6"] Mar 21 04:04:18 crc kubenswrapper[4685]: I0321 04:04:18.273686 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-e186-account-create-update-b7zr6" Mar 21 04:04:18 crc kubenswrapper[4685]: I0321 04:04:18.276812 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/keystone-db-create-27vgk"] Mar 21 04:04:18 crc kubenswrapper[4685]: I0321 04:04:18.277604 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-db-create-27vgk" Mar 21 04:04:18 crc kubenswrapper[4685]: I0321 04:04:18.284978 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-db-secret" Mar 21 04:04:18 crc kubenswrapper[4685]: I0321 04:04:18.314707 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-e186-account-create-update-b7zr6"] Mar 21 04:04:18 crc kubenswrapper[4685]: I0321 04:04:18.317792 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-db-create-27vgk"] Mar 21 04:04:18 crc kubenswrapper[4685]: I0321 04:04:18.318429 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccr6k\" (UniqueName: \"kubernetes.io/projected/83bd1c48-1210-4b1a-af10-08b876ec7665-kube-api-access-ccr6k\") pod \"keystone-e186-account-create-update-b7zr6\" (UID: \"83bd1c48-1210-4b1a-af10-08b876ec7665\") " pod="barbican-kuttl-tests/keystone-e186-account-create-update-b7zr6" Mar 21 04:04:18 crc kubenswrapper[4685]: I0321 04:04:18.318471 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83bd1c48-1210-4b1a-af10-08b876ec7665-operator-scripts\") pod \"keystone-e186-account-create-update-b7zr6\" (UID: \"83bd1c48-1210-4b1a-af10-08b876ec7665\") " pod="barbican-kuttl-tests/keystone-e186-account-create-update-b7zr6" Mar 21 04:04:18 crc kubenswrapper[4685]: I0321 04:04:18.420027 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8k5b\" (UniqueName: \"kubernetes.io/projected/561e0682-6870-4559-b124-759d2f133a56-kube-api-access-m8k5b\") pod \"keystone-db-create-27vgk\" (UID: \"561e0682-6870-4559-b124-759d2f133a56\") " pod="barbican-kuttl-tests/keystone-db-create-27vgk" Mar 21 04:04:18 crc kubenswrapper[4685]: I0321 04:04:18.420089 4685 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/561e0682-6870-4559-b124-759d2f133a56-operator-scripts\") pod \"keystone-db-create-27vgk\" (UID: \"561e0682-6870-4559-b124-759d2f133a56\") " pod="barbican-kuttl-tests/keystone-db-create-27vgk" Mar 21 04:04:18 crc kubenswrapper[4685]: I0321 04:04:18.420144 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccr6k\" (UniqueName: \"kubernetes.io/projected/83bd1c48-1210-4b1a-af10-08b876ec7665-kube-api-access-ccr6k\") pod \"keystone-e186-account-create-update-b7zr6\" (UID: \"83bd1c48-1210-4b1a-af10-08b876ec7665\") " pod="barbican-kuttl-tests/keystone-e186-account-create-update-b7zr6" Mar 21 04:04:18 crc kubenswrapper[4685]: I0321 04:04:18.420170 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83bd1c48-1210-4b1a-af10-08b876ec7665-operator-scripts\") pod \"keystone-e186-account-create-update-b7zr6\" (UID: \"83bd1c48-1210-4b1a-af10-08b876ec7665\") " pod="barbican-kuttl-tests/keystone-e186-account-create-update-b7zr6" Mar 21 04:04:18 crc kubenswrapper[4685]: I0321 04:04:18.420828 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83bd1c48-1210-4b1a-af10-08b876ec7665-operator-scripts\") pod \"keystone-e186-account-create-update-b7zr6\" (UID: \"83bd1c48-1210-4b1a-af10-08b876ec7665\") " pod="barbican-kuttl-tests/keystone-e186-account-create-update-b7zr6" Mar 21 04:04:18 crc kubenswrapper[4685]: I0321 04:04:18.442004 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccr6k\" (UniqueName: \"kubernetes.io/projected/83bd1c48-1210-4b1a-af10-08b876ec7665-kube-api-access-ccr6k\") pod \"keystone-e186-account-create-update-b7zr6\" (UID: \"83bd1c48-1210-4b1a-af10-08b876ec7665\") " pod="barbican-kuttl-tests/keystone-e186-account-create-update-b7zr6" Mar 21 04:04:18 crc kubenswrapper[4685]: I0321 04:04:18.521876 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8k5b\" (UniqueName: \"kubernetes.io/projected/561e0682-6870-4559-b124-759d2f133a56-kube-api-access-m8k5b\") pod \"keystone-db-create-27vgk\" (UID: \"561e0682-6870-4559-b124-759d2f133a56\") " pod="barbican-kuttl-tests/keystone-db-create-27vgk" Mar 21 04:04:18 crc kubenswrapper[4685]: I0321 04:04:18.521945 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/561e0682-6870-4559-b124-759d2f133a56-operator-scripts\") pod \"keystone-db-create-27vgk\" (UID: \"561e0682-6870-4559-b124-759d2f133a56\") " pod="barbican-kuttl-tests/keystone-db-create-27vgk" Mar 21 04:04:18 crc kubenswrapper[4685]: I0321 04:04:18.522796 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/561e0682-6870-4559-b124-759d2f133a56-operator-scripts\") pod \"keystone-db-create-27vgk\" (UID: \"561e0682-6870-4559-b124-759d2f133a56\") " pod="barbican-kuttl-tests/keystone-db-create-27vgk" Mar 21 04:04:18 crc kubenswrapper[4685]: I0321 04:04:18.541531 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8k5b\" (UniqueName: \"kubernetes.io/projected/561e0682-6870-4559-b124-759d2f133a56-kube-api-access-m8k5b\") pod \"keystone-db-create-27vgk\" (UID: 
\"561e0682-6870-4559-b124-759d2f133a56\") " pod="barbican-kuttl-tests/keystone-db-create-27vgk" Mar 21 04:04:18 crc kubenswrapper[4685]: I0321 04:04:18.588712 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-e186-account-create-update-b7zr6" Mar 21 04:04:18 crc kubenswrapper[4685]: I0321 04:04:18.596759 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-db-create-27vgk" Mar 21 04:04:18 crc kubenswrapper[4685]: I0321 04:04:18.869930 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-db-create-27vgk"] Mar 21 04:04:19 crc kubenswrapper[4685]: I0321 04:04:19.018664 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-e186-account-create-update-b7zr6"] Mar 21 04:04:19 crc kubenswrapper[4685]: I0321 04:04:19.165472 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-create-27vgk" event={"ID":"561e0682-6870-4559-b124-759d2f133a56","Type":"ContainerStarted","Data":"048b3767f6dddcdbfd89a1ee56c73e3f2bd43d828a6c717095fa7ae4777fd5b4"} Mar 21 04:04:19 crc kubenswrapper[4685]: I0321 04:04:19.165526 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-create-27vgk" event={"ID":"561e0682-6870-4559-b124-759d2f133a56","Type":"ContainerStarted","Data":"04bb44cf8ff744ba939bb20432ea948994d8b3e826b9e41d79317b98d4bd55f4"} Mar 21 04:04:19 crc kubenswrapper[4685]: I0321 04:04:19.167469 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-e186-account-create-update-b7zr6" event={"ID":"83bd1c48-1210-4b1a-af10-08b876ec7665","Type":"ContainerStarted","Data":"5c9ad915bb8485684b25dcbb01306ea349ffb95abbd38192e8ee720fef63b8c3"} Mar 21 04:04:19 crc kubenswrapper[4685]: I0321 04:04:19.167520 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-e186-account-create-update-b7zr6" event={"ID":"83bd1c48-1210-4b1a-af10-08b876ec7665","Type":"ContainerStarted","Data":"f907fec1e09ede20e397aea106341c6f44646300b7523966caddb8529415edb2"} Mar 21 04:04:19 crc kubenswrapper[4685]: I0321 04:04:19.182995 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/keystone-db-create-27vgk" podStartSLOduration=1.182972267 podStartE2EDuration="1.182972267s" podCreationTimestamp="2026-03-21 04:04:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:04:19.177455148 +0000 UTC m=+1091.654523950" watchObservedRunningTime="2026-03-21 04:04:19.182972267 +0000 UTC m=+1091.660041059" Mar 21 04:04:19 crc kubenswrapper[4685]: I0321 04:04:19.396901 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="barbican-kuttl-tests/rabbitmq-server-0" Mar 21 04:04:19 crc kubenswrapper[4685]: I0321 04:04:19.417880 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/keystone-e186-account-create-update-b7zr6" podStartSLOduration=1.417859812 podStartE2EDuration="1.417859812s" podCreationTimestamp="2026-03-21 04:04:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:04:19.194390927 +0000 UTC m=+1091.671459729" watchObservedRunningTime="2026-03-21 04:04:19.417859812 +0000 UTC m=+1091.894928764" Mar 21 04:04:20 crc 
kubenswrapper[4685]: I0321 04:04:20.174308 4685 generic.go:334] "Generic (PLEG): container finished" podID="561e0682-6870-4559-b124-759d2f133a56" containerID="048b3767f6dddcdbfd89a1ee56c73e3f2bd43d828a6c717095fa7ae4777fd5b4" exitCode=0 Mar 21 04:04:20 crc kubenswrapper[4685]: I0321 04:04:20.174366 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-create-27vgk" event={"ID":"561e0682-6870-4559-b124-759d2f133a56","Type":"ContainerDied","Data":"048b3767f6dddcdbfd89a1ee56c73e3f2bd43d828a6c717095fa7ae4777fd5b4"} Mar 21 04:04:20 crc kubenswrapper[4685]: I0321 04:04:20.176326 4685 generic.go:334] "Generic (PLEG): container finished" podID="83bd1c48-1210-4b1a-af10-08b876ec7665" containerID="5c9ad915bb8485684b25dcbb01306ea349ffb95abbd38192e8ee720fef63b8c3" exitCode=0 Mar 21 04:04:20 crc kubenswrapper[4685]: I0321 04:04:20.176389 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-e186-account-create-update-b7zr6" event={"ID":"83bd1c48-1210-4b1a-af10-08b876ec7665","Type":"ContainerDied","Data":"5c9ad915bb8485684b25dcbb01306ea349ffb95abbd38192e8ee720fef63b8c3"} Mar 21 04:04:20 crc kubenswrapper[4685]: I0321 04:04:20.965807 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d88th" Mar 21 04:04:20 crc kubenswrapper[4685]: I0321 04:04:20.966604 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d88th" Mar 21 04:04:21 crc kubenswrapper[4685]: I0321 04:04:21.036379 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d88th" Mar 21 04:04:21 crc kubenswrapper[4685]: I0321 04:04:21.233268 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d88th" Mar 21 04:04:21 crc kubenswrapper[4685]: I0321 04:04:21.491886 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-e186-account-create-update-b7zr6" Mar 21 04:04:21 crc kubenswrapper[4685]: I0321 04:04:21.496382 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-db-create-27vgk" Mar 21 04:04:21 crc kubenswrapper[4685]: I0321 04:04:21.561544 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/561e0682-6870-4559-b124-759d2f133a56-operator-scripts\") pod \"561e0682-6870-4559-b124-759d2f133a56\" (UID: \"561e0682-6870-4559-b124-759d2f133a56\") " Mar 21 04:04:21 crc kubenswrapper[4685]: I0321 04:04:21.562011 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/561e0682-6870-4559-b124-759d2f133a56-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "561e0682-6870-4559-b124-759d2f133a56" (UID: "561e0682-6870-4559-b124-759d2f133a56"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:04:21 crc kubenswrapper[4685]: I0321 04:04:21.562093 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8k5b\" (UniqueName: \"kubernetes.io/projected/561e0682-6870-4559-b124-759d2f133a56-kube-api-access-m8k5b\") pod \"561e0682-6870-4559-b124-759d2f133a56\" (UID: \"561e0682-6870-4559-b124-759d2f133a56\") " Mar 21 04:04:21 crc kubenswrapper[4685]: I0321 04:04:21.562171 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83bd1c48-1210-4b1a-af10-08b876ec7665-operator-scripts\") pod \"83bd1c48-1210-4b1a-af10-08b876ec7665\" (UID: \"83bd1c48-1210-4b1a-af10-08b876ec7665\") " Mar 21 04:04:21 crc kubenswrapper[4685]: I0321 04:04:21.562258 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccr6k\" (UniqueName: \"kubernetes.io/projected/83bd1c48-1210-4b1a-af10-08b876ec7665-kube-api-access-ccr6k\") pod \"83bd1c48-1210-4b1a-af10-08b876ec7665\" (UID: \"83bd1c48-1210-4b1a-af10-08b876ec7665\") " Mar 21 04:04:21 crc kubenswrapper[4685]: I0321 04:04:21.562512 4685 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/561e0682-6870-4559-b124-759d2f133a56-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:04:21 crc kubenswrapper[4685]: I0321 04:04:21.564381 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83bd1c48-1210-4b1a-af10-08b876ec7665-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "83bd1c48-1210-4b1a-af10-08b876ec7665" (UID: "83bd1c48-1210-4b1a-af10-08b876ec7665"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:04:21 crc kubenswrapper[4685]: I0321 04:04:21.567534 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/561e0682-6870-4559-b124-759d2f133a56-kube-api-access-m8k5b" (OuterVolumeSpecName: "kube-api-access-m8k5b") pod "561e0682-6870-4559-b124-759d2f133a56" (UID: "561e0682-6870-4559-b124-759d2f133a56"). InnerVolumeSpecName "kube-api-access-m8k5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:04:21 crc kubenswrapper[4685]: I0321 04:04:21.568052 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83bd1c48-1210-4b1a-af10-08b876ec7665-kube-api-access-ccr6k" (OuterVolumeSpecName: "kube-api-access-ccr6k") pod "83bd1c48-1210-4b1a-af10-08b876ec7665" (UID: "83bd1c48-1210-4b1a-af10-08b876ec7665"). InnerVolumeSpecName "kube-api-access-ccr6k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:04:21 crc kubenswrapper[4685]: I0321 04:04:21.664097 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8k5b\" (UniqueName: \"kubernetes.io/projected/561e0682-6870-4559-b124-759d2f133a56-kube-api-access-m8k5b\") on node \"crc\" DevicePath \"\"" Mar 21 04:04:21 crc kubenswrapper[4685]: I0321 04:04:21.664130 4685 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83bd1c48-1210-4b1a-af10-08b876ec7665-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:04:21 crc kubenswrapper[4685]: I0321 04:04:21.664142 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccr6k\" (UniqueName: \"kubernetes.io/projected/83bd1c48-1210-4b1a-af10-08b876ec7665-kube-api-access-ccr6k\") on node \"crc\" DevicePath \"\"" Mar 21 04:04:22 crc kubenswrapper[4685]: I0321 04:04:22.191883 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-create-27vgk" event={"ID":"561e0682-6870-4559-b124-759d2f133a56","Type":"ContainerDied","Data":"04bb44cf8ff744ba939bb20432ea948994d8b3e826b9e41d79317b98d4bd55f4"} Mar 21 04:04:22 crc kubenswrapper[4685]: I0321 04:04:22.191905 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-db-create-27vgk" Mar 21 04:04:22 crc kubenswrapper[4685]: I0321 04:04:22.192040 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04bb44cf8ff744ba939bb20432ea948994d8b3e826b9e41d79317b98d4bd55f4" Mar 21 04:04:22 crc kubenswrapper[4685]: I0321 04:04:22.193612 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-e186-account-create-update-b7zr6" event={"ID":"83bd1c48-1210-4b1a-af10-08b876ec7665","Type":"ContainerDied","Data":"f907fec1e09ede20e397aea106341c6f44646300b7523966caddb8529415edb2"} Mar 21 04:04:22 crc kubenswrapper[4685]: I0321 04:04:22.193640 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f907fec1e09ede20e397aea106341c6f44646300b7523966caddb8529415edb2" Mar 21 04:04:22 crc kubenswrapper[4685]: I0321 04:04:22.193654 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-e186-account-create-update-b7zr6" Mar 21 04:04:23 crc kubenswrapper[4685]: I0321 04:04:23.058490 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-index-6hfc5"] Mar 21 04:04:23 crc kubenswrapper[4685]: E0321 04:04:23.059593 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="561e0682-6870-4559-b124-759d2f133a56" containerName="mariadb-database-create" Mar 21 04:04:23 crc kubenswrapper[4685]: I0321 04:04:23.059620 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="561e0682-6870-4559-b124-759d2f133a56" containerName="mariadb-database-create" Mar 21 04:04:23 crc kubenswrapper[4685]: E0321 04:04:23.059648 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83bd1c48-1210-4b1a-af10-08b876ec7665" containerName="mariadb-account-create-update" Mar 21 04:04:23 crc kubenswrapper[4685]: I0321 04:04:23.059659 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="83bd1c48-1210-4b1a-af10-08b876ec7665" containerName="mariadb-account-create-update" Mar 21 04:04:23 crc kubenswrapper[4685]: I0321 04:04:23.059817 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="83bd1c48-1210-4b1a-af10-08b876ec7665" containerName="mariadb-account-create-update" Mar 21 04:04:23 crc kubenswrapper[4685]: I0321 04:04:23.059842 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="561e0682-6870-4559-b124-759d2f133a56" containerName="mariadb-database-create" Mar 21 04:04:23 crc kubenswrapper[4685]: I0321 04:04:23.060430 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-index-6hfc5" Mar 21 04:04:23 crc kubenswrapper[4685]: I0321 04:04:23.066985 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-index-dockercfg-7mxjn" Mar 21 04:04:23 crc kubenswrapper[4685]: I0321 04:04:23.076506 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-index-6hfc5"] Mar 21 04:04:23 crc kubenswrapper[4685]: I0321 04:04:23.182199 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj7gs\" (UniqueName: \"kubernetes.io/projected/ab0d8ea3-4677-4647-afbf-5fa241927ff7-kube-api-access-kj7gs\") pod \"barbican-operator-index-6hfc5\" (UID: \"ab0d8ea3-4677-4647-afbf-5fa241927ff7\") " pod="openstack-operators/barbican-operator-index-6hfc5" Mar 21 04:04:23 crc kubenswrapper[4685]: I0321 04:04:23.283865 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj7gs\" (UniqueName: \"kubernetes.io/projected/ab0d8ea3-4677-4647-afbf-5fa241927ff7-kube-api-access-kj7gs\") pod \"barbican-operator-index-6hfc5\" (UID: \"ab0d8ea3-4677-4647-afbf-5fa241927ff7\") " pod="openstack-operators/barbican-operator-index-6hfc5" Mar 21 04:04:23 crc kubenswrapper[4685]: I0321 04:04:23.302184 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj7gs\" (UniqueName: \"kubernetes.io/projected/ab0d8ea3-4677-4647-afbf-5fa241927ff7-kube-api-access-kj7gs\") pod \"barbican-operator-index-6hfc5\" (UID: \"ab0d8ea3-4677-4647-afbf-5fa241927ff7\") " pod="openstack-operators/barbican-operator-index-6hfc5" Mar 21 04:04:23 crc kubenswrapper[4685]: I0321 04:04:23.377129 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-index-6hfc5" Mar 21 04:04:23 crc kubenswrapper[4685]: I0321 04:04:23.712254 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/keystone-db-sync-cdf4l"] Mar 21 04:04:23 crc kubenswrapper[4685]: I0321 04:04:23.713365 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-db-sync-cdf4l" Mar 21 04:04:23 crc kubenswrapper[4685]: I0321 04:04:23.715425 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-config-data" Mar 21 04:04:23 crc kubenswrapper[4685]: I0321 04:04:23.715956 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-scripts" Mar 21 04:04:23 crc kubenswrapper[4685]: I0321 04:04:23.720359 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-keystone-dockercfg-mvlf5" Mar 21 04:04:23 crc kubenswrapper[4685]: I0321 04:04:23.721250 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone" Mar 21 04:04:23 crc kubenswrapper[4685]: I0321 04:04:23.726119 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-db-sync-cdf4l"] Mar 21 04:04:23 crc kubenswrapper[4685]: I0321 04:04:23.774213 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-index-6hfc5"] Mar 21 04:04:23 crc kubenswrapper[4685]: W0321 04:04:23.785088 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab0d8ea3_4677_4647_afbf_5fa241927ff7.slice/crio-3d41849ebef67437d34979e15b893a0e1b2358cbcbb03d787e9e348b25b6e23a WatchSource:0}: Error finding container 3d41849ebef67437d34979e15b893a0e1b2358cbcbb03d787e9e348b25b6e23a: Status 404 returned error can't find the container with id 3d41849ebef67437d34979e15b893a0e1b2358cbcbb03d787e9e348b25b6e23a Mar 21 04:04:23 crc kubenswrapper[4685]: I0321 04:04:23.792188 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgkhb\" (UniqueName: \"kubernetes.io/projected/8ff6ee16-ea1a-4725-b13c-ec201554a350-kube-api-access-jgkhb\") pod \"keystone-db-sync-cdf4l\" (UID: \"8ff6ee16-ea1a-4725-b13c-ec201554a350\") " pod="barbican-kuttl-tests/keystone-db-sync-cdf4l" Mar 21 04:04:23 crc kubenswrapper[4685]: I0321 04:04:23.792261 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ff6ee16-ea1a-4725-b13c-ec201554a350-config-data\") pod \"keystone-db-sync-cdf4l\" (UID: \"8ff6ee16-ea1a-4725-b13c-ec201554a350\") " pod="barbican-kuttl-tests/keystone-db-sync-cdf4l" Mar 21 04:04:23 crc kubenswrapper[4685]: I0321 04:04:23.893137 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgkhb\" (UniqueName: \"kubernetes.io/projected/8ff6ee16-ea1a-4725-b13c-ec201554a350-kube-api-access-jgkhb\") pod \"keystone-db-sync-cdf4l\" (UID: \"8ff6ee16-ea1a-4725-b13c-ec201554a350\") " pod="barbican-kuttl-tests/keystone-db-sync-cdf4l" Mar 21 04:04:23 crc kubenswrapper[4685]: I0321 04:04:23.893203 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ff6ee16-ea1a-4725-b13c-ec201554a350-config-data\") pod \"keystone-db-sync-cdf4l\" (UID: 
\"8ff6ee16-ea1a-4725-b13c-ec201554a350\") " pod="barbican-kuttl-tests/keystone-db-sync-cdf4l" Mar 21 04:04:23 crc kubenswrapper[4685]: I0321 04:04:23.898677 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ff6ee16-ea1a-4725-b13c-ec201554a350-config-data\") pod \"keystone-db-sync-cdf4l\" (UID: \"8ff6ee16-ea1a-4725-b13c-ec201554a350\") " pod="barbican-kuttl-tests/keystone-db-sync-cdf4l" Mar 21 04:04:23 crc kubenswrapper[4685]: I0321 04:04:23.912127 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgkhb\" (UniqueName: \"kubernetes.io/projected/8ff6ee16-ea1a-4725-b13c-ec201554a350-kube-api-access-jgkhb\") pod \"keystone-db-sync-cdf4l\" (UID: \"8ff6ee16-ea1a-4725-b13c-ec201554a350\") " pod="barbican-kuttl-tests/keystone-db-sync-cdf4l" Mar 21 04:04:24 crc kubenswrapper[4685]: I0321 04:04:24.032640 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-db-sync-cdf4l" Mar 21 04:04:24 crc kubenswrapper[4685]: I0321 04:04:24.215816 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-6hfc5" event={"ID":"ab0d8ea3-4677-4647-afbf-5fa241927ff7","Type":"ContainerStarted","Data":"3d41849ebef67437d34979e15b893a0e1b2358cbcbb03d787e9e348b25b6e23a"} Mar 21 04:04:24 crc kubenswrapper[4685]: I0321 04:04:24.261470 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-db-sync-cdf4l"] Mar 21 04:04:24 crc kubenswrapper[4685]: W0321 04:04:24.264360 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ff6ee16_ea1a_4725_b13c_ec201554a350.slice/crio-392130b085480d049ece401f37b5fc4b1563286ba1a5f0d2745577ac08d4d2f6 WatchSource:0}: Error finding container 392130b085480d049ece401f37b5fc4b1563286ba1a5f0d2745577ac08d4d2f6: Status 404 returned error can't find the container with id 392130b085480d049ece401f37b5fc4b1563286ba1a5f0d2745577ac08d4d2f6 Mar 21 04:04:24 crc kubenswrapper[4685]: I0321 04:04:24.443389 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d88th"] Mar 21 04:04:24 crc kubenswrapper[4685]: I0321 04:04:24.443786 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d88th" podUID="0352254e-d6b6-42ed-8c15-d44ce48e19a9" containerName="registry-server" containerID="cri-o://c37b5e87bd75595227b0130281ba74e7d714f94fffec0a69ac1ce4a9a214c1fc" gracePeriod=2 Mar 21 04:04:25 crc kubenswrapper[4685]: I0321 04:04:25.016914 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d88th" Mar 21 04:04:25 crc kubenswrapper[4685]: I0321 04:04:25.110712 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0352254e-d6b6-42ed-8c15-d44ce48e19a9-catalog-content\") pod \"0352254e-d6b6-42ed-8c15-d44ce48e19a9\" (UID: \"0352254e-d6b6-42ed-8c15-d44ce48e19a9\") " Mar 21 04:04:25 crc kubenswrapper[4685]: I0321 04:04:25.110837 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0352254e-d6b6-42ed-8c15-d44ce48e19a9-utilities\") pod \"0352254e-d6b6-42ed-8c15-d44ce48e19a9\" (UID: \"0352254e-d6b6-42ed-8c15-d44ce48e19a9\") " Mar 21 04:04:25 crc kubenswrapper[4685]: I0321 04:04:25.110922 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgwd2\" (UniqueName: \"kubernetes.io/projected/0352254e-d6b6-42ed-8c15-d44ce48e19a9-kube-api-access-mgwd2\") pod \"0352254e-d6b6-42ed-8c15-d44ce48e19a9\" (UID: \"0352254e-d6b6-42ed-8c15-d44ce48e19a9\") " Mar 21 04:04:25 crc kubenswrapper[4685]: I0321 04:04:25.111758 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0352254e-d6b6-42ed-8c15-d44ce48e19a9-utilities" (OuterVolumeSpecName: "utilities") pod "0352254e-d6b6-42ed-8c15-d44ce48e19a9" (UID: "0352254e-d6b6-42ed-8c15-d44ce48e19a9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:04:25 crc kubenswrapper[4685]: I0321 04:04:25.115980 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0352254e-d6b6-42ed-8c15-d44ce48e19a9-kube-api-access-mgwd2" (OuterVolumeSpecName: "kube-api-access-mgwd2") pod "0352254e-d6b6-42ed-8c15-d44ce48e19a9" (UID: "0352254e-d6b6-42ed-8c15-d44ce48e19a9"). InnerVolumeSpecName "kube-api-access-mgwd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:04:25 crc kubenswrapper[4685]: I0321 04:04:25.154300 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0352254e-d6b6-42ed-8c15-d44ce48e19a9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0352254e-d6b6-42ed-8c15-d44ce48e19a9" (UID: "0352254e-d6b6-42ed-8c15-d44ce48e19a9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:04:25 crc kubenswrapper[4685]: I0321 04:04:25.212540 4685 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0352254e-d6b6-42ed-8c15-d44ce48e19a9-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:04:25 crc kubenswrapper[4685]: I0321 04:04:25.212594 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgwd2\" (UniqueName: \"kubernetes.io/projected/0352254e-d6b6-42ed-8c15-d44ce48e19a9-kube-api-access-mgwd2\") on node \"crc\" DevicePath \"\"" Mar 21 04:04:25 crc kubenswrapper[4685]: I0321 04:04:25.212606 4685 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0352254e-d6b6-42ed-8c15-d44ce48e19a9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:04:25 crc kubenswrapper[4685]: I0321 04:04:25.227596 4685 generic.go:334] "Generic (PLEG): container finished" podID="0352254e-d6b6-42ed-8c15-d44ce48e19a9" containerID="c37b5e87bd75595227b0130281ba74e7d714f94fffec0a69ac1ce4a9a214c1fc" exitCode=0 Mar 21 04:04:25 crc kubenswrapper[4685]: I0321 04:04:25.227634 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d88th" event={"ID":"0352254e-d6b6-42ed-8c15-d44ce48e19a9","Type":"ContainerDied","Data":"c37b5e87bd75595227b0130281ba74e7d714f94fffec0a69ac1ce4a9a214c1fc"} Mar 21 04:04:25 crc kubenswrapper[4685]: I0321 04:04:25.227676 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d88th" Mar 21 04:04:25 crc kubenswrapper[4685]: I0321 04:04:25.227695 4685 scope.go:117] "RemoveContainer" containerID="c37b5e87bd75595227b0130281ba74e7d714f94fffec0a69ac1ce4a9a214c1fc" Mar 21 04:04:25 crc kubenswrapper[4685]: I0321 04:04:25.227677 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d88th" event={"ID":"0352254e-d6b6-42ed-8c15-d44ce48e19a9","Type":"ContainerDied","Data":"d39d03cc095cb09398df2beb6f71650f5c2bf4eed1f9a3b7e70ab98b8052409d"} Mar 21 04:04:25 crc kubenswrapper[4685]: I0321 04:04:25.230128 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-sync-cdf4l" event={"ID":"8ff6ee16-ea1a-4725-b13c-ec201554a350","Type":"ContainerStarted","Data":"392130b085480d049ece401f37b5fc4b1563286ba1a5f0d2745577ac08d4d2f6"} Mar 21 04:04:25 crc kubenswrapper[4685]: I0321 04:04:25.261418 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d88th"] Mar 21 04:04:25 crc kubenswrapper[4685]: I0321 04:04:25.261848 4685 scope.go:117] "RemoveContainer" containerID="2d083082abe6921d0bd84cb0137ce2760d0a038e76bfbd8648175db07428b2fe" Mar 21 04:04:25 crc kubenswrapper[4685]: I0321 04:04:25.270364 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d88th"] Mar 21 04:04:25 crc kubenswrapper[4685]: I0321 04:04:25.283412 4685 scope.go:117] "RemoveContainer" containerID="8ed1033c651dbe52868ae5e7a6467de0ccb242004d2c15f62b7ff199942c876a" Mar 21 04:04:25 crc kubenswrapper[4685]: I0321 04:04:25.298292 4685 scope.go:117] "RemoveContainer" containerID="c37b5e87bd75595227b0130281ba74e7d714f94fffec0a69ac1ce4a9a214c1fc" Mar 21 04:04:25 crc kubenswrapper[4685]: E0321 04:04:25.298702 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c37b5e87bd75595227b0130281ba74e7d714f94fffec0a69ac1ce4a9a214c1fc\": container with ID starting with c37b5e87bd75595227b0130281ba74e7d714f94fffec0a69ac1ce4a9a214c1fc not found: ID does not exist" containerID="c37b5e87bd75595227b0130281ba74e7d714f94fffec0a69ac1ce4a9a214c1fc" Mar 21 04:04:25 crc kubenswrapper[4685]: I0321 04:04:25.298758 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c37b5e87bd75595227b0130281ba74e7d714f94fffec0a69ac1ce4a9a214c1fc"} err="failed to get container status \"c37b5e87bd75595227b0130281ba74e7d714f94fffec0a69ac1ce4a9a214c1fc\": rpc error: code = NotFound desc = could not find container \"c37b5e87bd75595227b0130281ba74e7d714f94fffec0a69ac1ce4a9a214c1fc\": container with ID starting with c37b5e87bd75595227b0130281ba74e7d714f94fffec0a69ac1ce4a9a214c1fc not found: ID does not exist" Mar 21 04:04:25 crc kubenswrapper[4685]: I0321 04:04:25.298789 4685 scope.go:117] "RemoveContainer" containerID="2d083082abe6921d0bd84cb0137ce2760d0a038e76bfbd8648175db07428b2fe" Mar 21 04:04:25 crc kubenswrapper[4685]: E0321 04:04:25.299184 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d083082abe6921d0bd84cb0137ce2760d0a038e76bfbd8648175db07428b2fe\": container with ID starting with 2d083082abe6921d0bd84cb0137ce2760d0a038e76bfbd8648175db07428b2fe not found: ID does not exist" containerID="2d083082abe6921d0bd84cb0137ce2760d0a038e76bfbd8648175db07428b2fe" Mar 21 04:04:25 crc kubenswrapper[4685]: I0321 04:04:25.299209 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d083082abe6921d0bd84cb0137ce2760d0a038e76bfbd8648175db07428b2fe"} err="failed to get container status \"2d083082abe6921d0bd84cb0137ce2760d0a038e76bfbd8648175db07428b2fe\": rpc error: code = NotFound desc = could not find container \"2d083082abe6921d0bd84cb0137ce2760d0a038e76bfbd8648175db07428b2fe\": container with ID starting with 2d083082abe6921d0bd84cb0137ce2760d0a038e76bfbd8648175db07428b2fe not found: ID does not exist" Mar 21 04:04:25 crc kubenswrapper[4685]: I0321 04:04:25.299249 4685 scope.go:117] "RemoveContainer" containerID="8ed1033c651dbe52868ae5e7a6467de0ccb242004d2c15f62b7ff199942c876a" Mar 21 04:04:25 crc kubenswrapper[4685]: E0321 04:04:25.299644 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ed1033c651dbe52868ae5e7a6467de0ccb242004d2c15f62b7ff199942c876a\": container with ID starting with 8ed1033c651dbe52868ae5e7a6467de0ccb242004d2c15f62b7ff199942c876a not found: ID does not exist" containerID="8ed1033c651dbe52868ae5e7a6467de0ccb242004d2c15f62b7ff199942c876a" Mar 21 04:04:25 crc kubenswrapper[4685]: I0321 04:04:25.299672 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ed1033c651dbe52868ae5e7a6467de0ccb242004d2c15f62b7ff199942c876a"} err="failed to get container status \"8ed1033c651dbe52868ae5e7a6467de0ccb242004d2c15f62b7ff199942c876a\": rpc error: code = NotFound desc = could not find container \"8ed1033c651dbe52868ae5e7a6467de0ccb242004d2c15f62b7ff199942c876a\": container with ID starting with 8ed1033c651dbe52868ae5e7a6467de0ccb242004d2c15f62b7ff199942c876a not found: ID does not exist" Mar 21 04:04:26 crc kubenswrapper[4685]: I0321 04:04:26.318290 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0352254e-d6b6-42ed-8c15-d44ce48e19a9" 
path="/var/lib/kubelet/pods/0352254e-d6b6-42ed-8c15-d44ce48e19a9/volumes" Mar 21 04:04:28 crc kubenswrapper[4685]: I0321 04:04:28.835586 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-index-6hfc5"] Mar 21 04:04:29 crc kubenswrapper[4685]: I0321 04:04:29.646097 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-index-m4x5b"] Mar 21 04:04:29 crc kubenswrapper[4685]: E0321 04:04:29.646376 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0352254e-d6b6-42ed-8c15-d44ce48e19a9" containerName="registry-server" Mar 21 04:04:29 crc kubenswrapper[4685]: I0321 04:04:29.646388 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="0352254e-d6b6-42ed-8c15-d44ce48e19a9" containerName="registry-server" Mar 21 04:04:29 crc kubenswrapper[4685]: E0321 04:04:29.646403 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0352254e-d6b6-42ed-8c15-d44ce48e19a9" containerName="extract-utilities" Mar 21 04:04:29 crc kubenswrapper[4685]: I0321 04:04:29.646411 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="0352254e-d6b6-42ed-8c15-d44ce48e19a9" containerName="extract-utilities" Mar 21 04:04:29 crc kubenswrapper[4685]: E0321 04:04:29.646425 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0352254e-d6b6-42ed-8c15-d44ce48e19a9" containerName="extract-content" Mar 21 04:04:29 crc kubenswrapper[4685]: I0321 04:04:29.646432 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="0352254e-d6b6-42ed-8c15-d44ce48e19a9" containerName="extract-content" Mar 21 04:04:29 crc kubenswrapper[4685]: I0321 04:04:29.646573 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="0352254e-d6b6-42ed-8c15-d44ce48e19a9" containerName="registry-server" Mar 21 04:04:29 crc kubenswrapper[4685]: I0321 04:04:29.647076 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-index-m4x5b" Mar 21 04:04:29 crc kubenswrapper[4685]: I0321 04:04:29.652381 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-index-m4x5b"] Mar 21 04:04:29 crc kubenswrapper[4685]: I0321 04:04:29.719284 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npzkd\" (UniqueName: \"kubernetes.io/projected/08b81240-20f5-499a-afd2-5666d0fa97e3-kube-api-access-npzkd\") pod \"barbican-operator-index-m4x5b\" (UID: \"08b81240-20f5-499a-afd2-5666d0fa97e3\") " pod="openstack-operators/barbican-operator-index-m4x5b" Mar 21 04:04:29 crc kubenswrapper[4685]: I0321 04:04:29.820750 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npzkd\" (UniqueName: \"kubernetes.io/projected/08b81240-20f5-499a-afd2-5666d0fa97e3-kube-api-access-npzkd\") pod \"barbican-operator-index-m4x5b\" (UID: \"08b81240-20f5-499a-afd2-5666d0fa97e3\") " pod="openstack-operators/barbican-operator-index-m4x5b" Mar 21 04:04:29 crc kubenswrapper[4685]: I0321 04:04:29.841614 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npzkd\" (UniqueName: \"kubernetes.io/projected/08b81240-20f5-499a-afd2-5666d0fa97e3-kube-api-access-npzkd\") pod \"barbican-operator-index-m4x5b\" (UID: \"08b81240-20f5-499a-afd2-5666d0fa97e3\") " pod="openstack-operators/barbican-operator-index-m4x5b" Mar 21 04:04:29 crc kubenswrapper[4685]: I0321 04:04:29.978526 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-index-m4x5b" Mar 21 04:04:32 crc kubenswrapper[4685]: I0321 04:04:32.615555 4685 scope.go:117] "RemoveContainer" containerID="5fc29b3be8e21d56ea45ae6e5f1b216d3a227b41111f1fd30378634675fd304a" Mar 21 04:04:38 crc kubenswrapper[4685]: I0321 04:04:38.677878 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-index-m4x5b"] Mar 21 04:04:38 crc kubenswrapper[4685]: W0321 04:04:38.683365 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08b81240_20f5_499a_afd2_5666d0fa97e3.slice/crio-f1091d9223c67499bfb71dfe286ed0918c22d46b2490ed927983f03f8a206990 WatchSource:0}: Error finding container f1091d9223c67499bfb71dfe286ed0918c22d46b2490ed927983f03f8a206990: Status 404 returned error can't find the container with id f1091d9223c67499bfb71dfe286ed0918c22d46b2490ed927983f03f8a206990 Mar 21 04:04:39 crc kubenswrapper[4685]: I0321 04:04:39.337862 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-m4x5b" event={"ID":"08b81240-20f5-499a-afd2-5666d0fa97e3","Type":"ContainerStarted","Data":"8096e48df5e824c20ac9ff8c44e5d00c0e44afcb986921a00eb143825932c6ef"} Mar 21 04:04:39 crc kubenswrapper[4685]: I0321 04:04:39.337902 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-m4x5b" event={"ID":"08b81240-20f5-499a-afd2-5666d0fa97e3","Type":"ContainerStarted","Data":"f1091d9223c67499bfb71dfe286ed0918c22d46b2490ed927983f03f8a206990"} Mar 21 04:04:39 crc kubenswrapper[4685]: I0321 04:04:39.339435 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-sync-cdf4l" 
event={"ID":"8ff6ee16-ea1a-4725-b13c-ec201554a350","Type":"ContainerStarted","Data":"cf8e3abcb5fdb58d7e4723134a52f88708af10f1a7caa07e15a0ed125f689048"} Mar 21 04:04:39 crc kubenswrapper[4685]: I0321 04:04:39.341315 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-6hfc5" event={"ID":"ab0d8ea3-4677-4647-afbf-5fa241927ff7","Type":"ContainerStarted","Data":"6e22cb06bb519eef791dad6506e2e553821ceb35768225d40628d2d0f7a60157"} Mar 21 04:04:39 crc kubenswrapper[4685]: I0321 04:04:39.341400 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/barbican-operator-index-6hfc5" podUID="ab0d8ea3-4677-4647-afbf-5fa241927ff7" containerName="registry-server" containerID="cri-o://6e22cb06bb519eef791dad6506e2e553821ceb35768225d40628d2d0f7a60157" gracePeriod=2 Mar 21 04:04:39 crc kubenswrapper[4685]: I0321 04:04:39.385405 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-index-m4x5b" podStartSLOduration=10.325342436 podStartE2EDuration="10.385389857s" podCreationTimestamp="2026-03-21 04:04:29 +0000 UTC" firstStartedPulling="2026-03-21 04:04:38.686630429 +0000 UTC m=+1111.163699221" lastFinishedPulling="2026-03-21 04:04:38.74667782 +0000 UTC m=+1111.223746642" observedRunningTime="2026-03-21 04:04:39.3594469 +0000 UTC m=+1111.836515692" watchObservedRunningTime="2026-03-21 04:04:39.385389857 +0000 UTC m=+1111.862458649" Mar 21 04:04:39 crc kubenswrapper[4685]: I0321 04:04:39.403252 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/keystone-db-sync-cdf4l" podStartSLOduration=2.34131595 podStartE2EDuration="16.403234406s" podCreationTimestamp="2026-03-21 04:04:23 +0000 UTC" firstStartedPulling="2026-03-21 04:04:24.267811598 +0000 UTC m=+1096.744880400" lastFinishedPulling="2026-03-21 04:04:38.329730064 +0000 UTC m=+1110.806798856" observedRunningTime="2026-03-21 04:04:39.387442904 +0000 UTC m=+1111.864511686" watchObservedRunningTime="2026-03-21 04:04:39.403234406 +0000 UTC m=+1111.880303198" Mar 21 04:04:39 crc kubenswrapper[4685]: I0321 04:04:39.404719 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-index-6hfc5" podStartSLOduration=1.925523691 podStartE2EDuration="16.404713128s" podCreationTimestamp="2026-03-21 04:04:23 +0000 UTC" firstStartedPulling="2026-03-21 04:04:23.786826753 +0000 UTC m=+1096.263895545" lastFinishedPulling="2026-03-21 04:04:38.26601616 +0000 UTC m=+1110.743084982" observedRunningTime="2026-03-21 04:04:39.400328025 +0000 UTC m=+1111.877396817" watchObservedRunningTime="2026-03-21 04:04:39.404713128 +0000 UTC m=+1111.881781910" Mar 21 04:04:39 crc kubenswrapper[4685]: I0321 04:04:39.698512 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-index-6hfc5" Mar 21 04:04:39 crc kubenswrapper[4685]: I0321 04:04:39.779434 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj7gs\" (UniqueName: \"kubernetes.io/projected/ab0d8ea3-4677-4647-afbf-5fa241927ff7-kube-api-access-kj7gs\") pod \"ab0d8ea3-4677-4647-afbf-5fa241927ff7\" (UID: \"ab0d8ea3-4677-4647-afbf-5fa241927ff7\") " Mar 21 04:04:39 crc kubenswrapper[4685]: I0321 04:04:39.785456 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab0d8ea3-4677-4647-afbf-5fa241927ff7-kube-api-access-kj7gs" (OuterVolumeSpecName: "kube-api-access-kj7gs") pod "ab0d8ea3-4677-4647-afbf-5fa241927ff7" (UID: "ab0d8ea3-4677-4647-afbf-5fa241927ff7"). InnerVolumeSpecName "kube-api-access-kj7gs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:04:39 crc kubenswrapper[4685]: I0321 04:04:39.881235 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kj7gs\" (UniqueName: \"kubernetes.io/projected/ab0d8ea3-4677-4647-afbf-5fa241927ff7-kube-api-access-kj7gs\") on node \"crc\" DevicePath \"\"" Mar 21 04:04:39 crc kubenswrapper[4685]: I0321 04:04:39.979314 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/barbican-operator-index-m4x5b" Mar 21 04:04:39 crc kubenswrapper[4685]: I0321 04:04:39.979371 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-index-m4x5b" Mar 21 04:04:40 crc kubenswrapper[4685]: I0321 04:04:40.007327 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/barbican-operator-index-m4x5b" Mar 21 04:04:40 crc kubenswrapper[4685]: I0321 04:04:40.350899 4685 generic.go:334] "Generic (PLEG): container finished" podID="ab0d8ea3-4677-4647-afbf-5fa241927ff7" containerID="6e22cb06bb519eef791dad6506e2e553821ceb35768225d40628d2d0f7a60157" exitCode=0 Mar 21 04:04:40 crc kubenswrapper[4685]: I0321 04:04:40.350990 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-index-6hfc5" Mar 21 04:04:40 crc kubenswrapper[4685]: I0321 04:04:40.350975 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-6hfc5" event={"ID":"ab0d8ea3-4677-4647-afbf-5fa241927ff7","Type":"ContainerDied","Data":"6e22cb06bb519eef791dad6506e2e553821ceb35768225d40628d2d0f7a60157"} Mar 21 04:04:40 crc kubenswrapper[4685]: I0321 04:04:40.351297 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-6hfc5" event={"ID":"ab0d8ea3-4677-4647-afbf-5fa241927ff7","Type":"ContainerDied","Data":"3d41849ebef67437d34979e15b893a0e1b2358cbcbb03d787e9e348b25b6e23a"} Mar 21 04:04:40 crc kubenswrapper[4685]: I0321 04:04:40.351322 4685 scope.go:117] "RemoveContainer" containerID="6e22cb06bb519eef791dad6506e2e553821ceb35768225d40628d2d0f7a60157" Mar 21 04:04:40 crc kubenswrapper[4685]: I0321 04:04:40.375173 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-index-6hfc5"] Mar 21 04:04:40 crc kubenswrapper[4685]: I0321 04:04:40.380110 4685 scope.go:117] "RemoveContainer" containerID="6e22cb06bb519eef791dad6506e2e553821ceb35768225d40628d2d0f7a60157" Mar 21 04:04:40 crc kubenswrapper[4685]: E0321 04:04:40.380733 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e22cb06bb519eef791dad6506e2e553821ceb35768225d40628d2d0f7a60157\": container with ID starting with 6e22cb06bb519eef791dad6506e2e553821ceb35768225d40628d2d0f7a60157 not found: ID does not exist" containerID="6e22cb06bb519eef791dad6506e2e553821ceb35768225d40628d2d0f7a60157" Mar 21 04:04:40 crc kubenswrapper[4685]: I0321 04:04:40.380860 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e22cb06bb519eef791dad6506e2e553821ceb35768225d40628d2d0f7a60157"} err="failed to get container status \"6e22cb06bb519eef791dad6506e2e553821ceb35768225d40628d2d0f7a60157\": rpc error: code = NotFound desc = could not find container \"6e22cb06bb519eef791dad6506e2e553821ceb35768225d40628d2d0f7a60157\": container with ID starting with 6e22cb06bb519eef791dad6506e2e553821ceb35768225d40628d2d0f7a60157 not found: ID does not exist" Mar 21 04:04:40 crc kubenswrapper[4685]: I0321 04:04:40.382516 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/barbican-operator-index-6hfc5"] Mar 21 04:04:42 crc kubenswrapper[4685]: I0321 04:04:42.310074 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab0d8ea3-4677-4647-afbf-5fa241927ff7" path="/var/lib/kubelet/pods/ab0d8ea3-4677-4647-afbf-5fa241927ff7/volumes" Mar 21 04:04:43 crc kubenswrapper[4685]: I0321 04:04:43.378367 4685 generic.go:334] "Generic (PLEG): container finished" podID="8ff6ee16-ea1a-4725-b13c-ec201554a350" containerID="cf8e3abcb5fdb58d7e4723134a52f88708af10f1a7caa07e15a0ed125f689048" exitCode=0 Mar 21 04:04:43 crc kubenswrapper[4685]: I0321 04:04:43.378446 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-sync-cdf4l" event={"ID":"8ff6ee16-ea1a-4725-b13c-ec201554a350","Type":"ContainerDied","Data":"cf8e3abcb5fdb58d7e4723134a52f88708af10f1a7caa07e15a0ed125f689048"} Mar 21 04:04:44 crc kubenswrapper[4685]: I0321 04:04:44.684152 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-db-sync-cdf4l" Mar 21 04:04:44 crc kubenswrapper[4685]: I0321 04:04:44.843224 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ff6ee16-ea1a-4725-b13c-ec201554a350-config-data\") pod \"8ff6ee16-ea1a-4725-b13c-ec201554a350\" (UID: \"8ff6ee16-ea1a-4725-b13c-ec201554a350\") " Mar 21 04:04:44 crc kubenswrapper[4685]: I0321 04:04:44.843506 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgkhb\" (UniqueName: \"kubernetes.io/projected/8ff6ee16-ea1a-4725-b13c-ec201554a350-kube-api-access-jgkhb\") pod \"8ff6ee16-ea1a-4725-b13c-ec201554a350\" (UID: \"8ff6ee16-ea1a-4725-b13c-ec201554a350\") " Mar 21 04:04:44 crc kubenswrapper[4685]: I0321 04:04:44.853534 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ff6ee16-ea1a-4725-b13c-ec201554a350-kube-api-access-jgkhb" (OuterVolumeSpecName: "kube-api-access-jgkhb") pod "8ff6ee16-ea1a-4725-b13c-ec201554a350" (UID: "8ff6ee16-ea1a-4725-b13c-ec201554a350"). InnerVolumeSpecName "kube-api-access-jgkhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:04:44 crc kubenswrapper[4685]: I0321 04:04:44.875116 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ff6ee16-ea1a-4725-b13c-ec201554a350-config-data" (OuterVolumeSpecName: "config-data") pod "8ff6ee16-ea1a-4725-b13c-ec201554a350" (UID: "8ff6ee16-ea1a-4725-b13c-ec201554a350"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:04:44 crc kubenswrapper[4685]: I0321 04:04:44.945310 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgkhb\" (UniqueName: \"kubernetes.io/projected/8ff6ee16-ea1a-4725-b13c-ec201554a350-kube-api-access-jgkhb\") on node \"crc\" DevicePath \"\"" Mar 21 04:04:44 crc kubenswrapper[4685]: I0321 04:04:44.945676 4685 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ff6ee16-ea1a-4725-b13c-ec201554a350-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:04:45 crc kubenswrapper[4685]: I0321 04:04:45.395141 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-sync-cdf4l" event={"ID":"8ff6ee16-ea1a-4725-b13c-ec201554a350","Type":"ContainerDied","Data":"392130b085480d049ece401f37b5fc4b1563286ba1a5f0d2745577ac08d4d2f6"} Mar 21 04:04:45 crc kubenswrapper[4685]: I0321 04:04:45.395217 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="392130b085480d049ece401f37b5fc4b1563286ba1a5f0d2745577ac08d4d2f6" Mar 21 04:04:45 crc kubenswrapper[4685]: I0321 04:04:45.395232 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-db-sync-cdf4l" Mar 21 04:04:45 crc kubenswrapper[4685]: I0321 04:04:45.588459 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/keystone-bootstrap-dqllc"] Mar 21 04:04:45 crc kubenswrapper[4685]: E0321 04:04:45.588771 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ff6ee16-ea1a-4725-b13c-ec201554a350" containerName="keystone-db-sync" Mar 21 04:04:45 crc kubenswrapper[4685]: I0321 04:04:45.588791 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ff6ee16-ea1a-4725-b13c-ec201554a350" containerName="keystone-db-sync" Mar 21 04:04:45 crc kubenswrapper[4685]: E0321 04:04:45.588822 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab0d8ea3-4677-4647-afbf-5fa241927ff7" containerName="registry-server" Mar 21 04:04:45 crc kubenswrapper[4685]: I0321 04:04:45.588831 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab0d8ea3-4677-4647-afbf-5fa241927ff7" containerName="registry-server" Mar 21 04:04:45 crc kubenswrapper[4685]: I0321 04:04:45.589075 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ff6ee16-ea1a-4725-b13c-ec201554a350" containerName="keystone-db-sync" Mar 21 04:04:45 crc kubenswrapper[4685]: I0321 04:04:45.589091 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab0d8ea3-4677-4647-afbf-5fa241927ff7" containerName="registry-server" Mar 21 04:04:45 crc kubenswrapper[4685]: I0321 04:04:45.589606 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-bootstrap-dqllc" Mar 21 04:04:45 crc kubenswrapper[4685]: I0321 04:04:45.591335 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-scripts" Mar 21 04:04:45 crc kubenswrapper[4685]: I0321 04:04:45.591819 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone" Mar 21 04:04:45 crc kubenswrapper[4685]: I0321 04:04:45.592049 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"osp-secret" Mar 21 04:04:45 crc kubenswrapper[4685]: I0321 04:04:45.594644 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-config-data" Mar 21 04:04:45 crc kubenswrapper[4685]: I0321 04:04:45.596337 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-keystone-dockercfg-mvlf5" Mar 21 04:04:45 crc kubenswrapper[4685]: I0321 04:04:45.599637 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-bootstrap-dqllc"] Mar 21 04:04:45 crc kubenswrapper[4685]: I0321 04:04:45.757455 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03969ee5-9f6c-4a50-8a9f-fd651fe30da2-scripts\") pod \"keystone-bootstrap-dqllc\" (UID: \"03969ee5-9f6c-4a50-8a9f-fd651fe30da2\") " pod="barbican-kuttl-tests/keystone-bootstrap-dqllc" Mar 21 04:04:45 crc kubenswrapper[4685]: I0321 04:04:45.757739 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03969ee5-9f6c-4a50-8a9f-fd651fe30da2-fernet-keys\") pod \"keystone-bootstrap-dqllc\" (UID: \"03969ee5-9f6c-4a50-8a9f-fd651fe30da2\") " pod="barbican-kuttl-tests/keystone-bootstrap-dqllc" Mar 21 04:04:45 crc kubenswrapper[4685]: I0321 04:04:45.757758 4685 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7ctb\" (UniqueName: \"kubernetes.io/projected/03969ee5-9f6c-4a50-8a9f-fd651fe30da2-kube-api-access-l7ctb\") pod \"keystone-bootstrap-dqllc\" (UID: \"03969ee5-9f6c-4a50-8a9f-fd651fe30da2\") " pod="barbican-kuttl-tests/keystone-bootstrap-dqllc" Mar 21 04:04:45 crc kubenswrapper[4685]: I0321 04:04:45.757780 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03969ee5-9f6c-4a50-8a9f-fd651fe30da2-credential-keys\") pod \"keystone-bootstrap-dqllc\" (UID: \"03969ee5-9f6c-4a50-8a9f-fd651fe30da2\") " pod="barbican-kuttl-tests/keystone-bootstrap-dqllc" Mar 21 04:04:45 crc kubenswrapper[4685]: I0321 04:04:45.757956 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03969ee5-9f6c-4a50-8a9f-fd651fe30da2-config-data\") pod \"keystone-bootstrap-dqllc\" (UID: \"03969ee5-9f6c-4a50-8a9f-fd651fe30da2\") " pod="barbican-kuttl-tests/keystone-bootstrap-dqllc" Mar 21 04:04:45 crc kubenswrapper[4685]: I0321 04:04:45.859286 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03969ee5-9f6c-4a50-8a9f-fd651fe30da2-scripts\") pod \"keystone-bootstrap-dqllc\" (UID: \"03969ee5-9f6c-4a50-8a9f-fd651fe30da2\") " pod="barbican-kuttl-tests/keystone-bootstrap-dqllc" Mar 21 04:04:45 crc kubenswrapper[4685]: I0321 04:04:45.859334 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03969ee5-9f6c-4a50-8a9f-fd651fe30da2-fernet-keys\") pod \"keystone-bootstrap-dqllc\" (UID: \"03969ee5-9f6c-4a50-8a9f-fd651fe30da2\") " pod="barbican-kuttl-tests/keystone-bootstrap-dqllc" Mar 21 04:04:45 crc kubenswrapper[4685]: I0321 04:04:45.859352 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7ctb\" (UniqueName: \"kubernetes.io/projected/03969ee5-9f6c-4a50-8a9f-fd651fe30da2-kube-api-access-l7ctb\") pod \"keystone-bootstrap-dqllc\" (UID: \"03969ee5-9f6c-4a50-8a9f-fd651fe30da2\") " pod="barbican-kuttl-tests/keystone-bootstrap-dqllc" Mar 21 04:04:45 crc kubenswrapper[4685]: I0321 04:04:45.859375 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03969ee5-9f6c-4a50-8a9f-fd651fe30da2-credential-keys\") pod \"keystone-bootstrap-dqllc\" (UID: \"03969ee5-9f6c-4a50-8a9f-fd651fe30da2\") " pod="barbican-kuttl-tests/keystone-bootstrap-dqllc" Mar 21 04:04:45 crc kubenswrapper[4685]: I0321 04:04:45.859411 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03969ee5-9f6c-4a50-8a9f-fd651fe30da2-config-data\") pod \"keystone-bootstrap-dqllc\" (UID: \"03969ee5-9f6c-4a50-8a9f-fd651fe30da2\") " pod="barbican-kuttl-tests/keystone-bootstrap-dqllc" Mar 21 04:04:45 crc kubenswrapper[4685]: I0321 04:04:45.863341 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03969ee5-9f6c-4a50-8a9f-fd651fe30da2-credential-keys\") pod \"keystone-bootstrap-dqllc\" (UID: \"03969ee5-9f6c-4a50-8a9f-fd651fe30da2\") " pod="barbican-kuttl-tests/keystone-bootstrap-dqllc" Mar 21 04:04:45 crc kubenswrapper[4685]: I0321 04:04:45.863457 
4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03969ee5-9f6c-4a50-8a9f-fd651fe30da2-config-data\") pod \"keystone-bootstrap-dqllc\" (UID: \"03969ee5-9f6c-4a50-8a9f-fd651fe30da2\") " pod="barbican-kuttl-tests/keystone-bootstrap-dqllc" Mar 21 04:04:45 crc kubenswrapper[4685]: I0321 04:04:45.864209 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03969ee5-9f6c-4a50-8a9f-fd651fe30da2-fernet-keys\") pod \"keystone-bootstrap-dqllc\" (UID: \"03969ee5-9f6c-4a50-8a9f-fd651fe30da2\") " pod="barbican-kuttl-tests/keystone-bootstrap-dqllc" Mar 21 04:04:45 crc kubenswrapper[4685]: I0321 04:04:45.864364 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03969ee5-9f6c-4a50-8a9f-fd651fe30da2-scripts\") pod \"keystone-bootstrap-dqllc\" (UID: \"03969ee5-9f6c-4a50-8a9f-fd651fe30da2\") " pod="barbican-kuttl-tests/keystone-bootstrap-dqllc" Mar 21 04:04:45 crc kubenswrapper[4685]: I0321 04:04:45.888066 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7ctb\" (UniqueName: \"kubernetes.io/projected/03969ee5-9f6c-4a50-8a9f-fd651fe30da2-kube-api-access-l7ctb\") pod \"keystone-bootstrap-dqllc\" (UID: \"03969ee5-9f6c-4a50-8a9f-fd651fe30da2\") " pod="barbican-kuttl-tests/keystone-bootstrap-dqllc" Mar 21 04:04:45 crc kubenswrapper[4685]: I0321 04:04:45.910491 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-bootstrap-dqllc" Mar 21 04:04:46 crc kubenswrapper[4685]: I0321 04:04:46.308607 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-bootstrap-dqllc"] Mar 21 04:04:46 crc kubenswrapper[4685]: I0321 04:04:46.402756 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-bootstrap-dqllc" event={"ID":"03969ee5-9f6c-4a50-8a9f-fd651fe30da2","Type":"ContainerStarted","Data":"b2f429f1ca01df3a0334d9b6528a2ed86547b4ad62a86d99e391f4f91cd8c642"} Mar 21 04:04:47 crc kubenswrapper[4685]: I0321 04:04:47.413071 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-bootstrap-dqllc" event={"ID":"03969ee5-9f6c-4a50-8a9f-fd651fe30da2","Type":"ContainerStarted","Data":"488bfae4e304e9216c77c046db670be2391ea0f1783a9db9181b2197faca2483"} Mar 21 04:04:47 crc kubenswrapper[4685]: I0321 04:04:47.439699 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/keystone-bootstrap-dqllc" podStartSLOduration=2.439679031 podStartE2EDuration="2.439679031s" podCreationTimestamp="2026-03-21 04:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:04:47.434366932 +0000 UTC m=+1119.911435724" watchObservedRunningTime="2026-03-21 04:04:47.439679031 +0000 UTC m=+1119.916747833" Mar 21 04:04:49 crc kubenswrapper[4685]: I0321 04:04:49.428619 4685 generic.go:334] "Generic (PLEG): container finished" podID="03969ee5-9f6c-4a50-8a9f-fd651fe30da2" containerID="488bfae4e304e9216c77c046db670be2391ea0f1783a9db9181b2197faca2483" exitCode=0 Mar 21 04:04:49 crc kubenswrapper[4685]: I0321 04:04:49.428718 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-bootstrap-dqllc" 
event={"ID":"03969ee5-9f6c-4a50-8a9f-fd651fe30da2","Type":"ContainerDied","Data":"488bfae4e304e9216c77c046db670be2391ea0f1783a9db9181b2197faca2483"} Mar 21 04:04:50 crc kubenswrapper[4685]: I0321 04:04:50.009265 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-index-m4x5b" Mar 21 04:04:50 crc kubenswrapper[4685]: I0321 04:04:50.720251 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-bootstrap-dqllc" Mar 21 04:04:50 crc kubenswrapper[4685]: I0321 04:04:50.830237 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03969ee5-9f6c-4a50-8a9f-fd651fe30da2-scripts\") pod \"03969ee5-9f6c-4a50-8a9f-fd651fe30da2\" (UID: \"03969ee5-9f6c-4a50-8a9f-fd651fe30da2\") " Mar 21 04:04:50 crc kubenswrapper[4685]: I0321 04:04:50.830306 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03969ee5-9f6c-4a50-8a9f-fd651fe30da2-fernet-keys\") pod \"03969ee5-9f6c-4a50-8a9f-fd651fe30da2\" (UID: \"03969ee5-9f6c-4a50-8a9f-fd651fe30da2\") " Mar 21 04:04:50 crc kubenswrapper[4685]: I0321 04:04:50.830358 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7ctb\" (UniqueName: \"kubernetes.io/projected/03969ee5-9f6c-4a50-8a9f-fd651fe30da2-kube-api-access-l7ctb\") pod \"03969ee5-9f6c-4a50-8a9f-fd651fe30da2\" (UID: \"03969ee5-9f6c-4a50-8a9f-fd651fe30da2\") " Mar 21 04:04:50 crc kubenswrapper[4685]: I0321 04:04:50.830387 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03969ee5-9f6c-4a50-8a9f-fd651fe30da2-credential-keys\") pod \"03969ee5-9f6c-4a50-8a9f-fd651fe30da2\" (UID: \"03969ee5-9f6c-4a50-8a9f-fd651fe30da2\") " Mar 21 04:04:50 crc kubenswrapper[4685]: I0321 04:04:50.830477 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03969ee5-9f6c-4a50-8a9f-fd651fe30da2-config-data\") pod \"03969ee5-9f6c-4a50-8a9f-fd651fe30da2\" (UID: \"03969ee5-9f6c-4a50-8a9f-fd651fe30da2\") " Mar 21 04:04:50 crc kubenswrapper[4685]: I0321 04:04:50.836135 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03969ee5-9f6c-4a50-8a9f-fd651fe30da2-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "03969ee5-9f6c-4a50-8a9f-fd651fe30da2" (UID: "03969ee5-9f6c-4a50-8a9f-fd651fe30da2"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:04:50 crc kubenswrapper[4685]: I0321 04:04:50.836641 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03969ee5-9f6c-4a50-8a9f-fd651fe30da2-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "03969ee5-9f6c-4a50-8a9f-fd651fe30da2" (UID: "03969ee5-9f6c-4a50-8a9f-fd651fe30da2"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:04:50 crc kubenswrapper[4685]: I0321 04:04:50.836739 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03969ee5-9f6c-4a50-8a9f-fd651fe30da2-kube-api-access-l7ctb" (OuterVolumeSpecName: "kube-api-access-l7ctb") pod "03969ee5-9f6c-4a50-8a9f-fd651fe30da2" (UID: "03969ee5-9f6c-4a50-8a9f-fd651fe30da2"). 
InnerVolumeSpecName "kube-api-access-l7ctb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:04:50 crc kubenswrapper[4685]: I0321 04:04:50.838356 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03969ee5-9f6c-4a50-8a9f-fd651fe30da2-scripts" (OuterVolumeSpecName: "scripts") pod "03969ee5-9f6c-4a50-8a9f-fd651fe30da2" (UID: "03969ee5-9f6c-4a50-8a9f-fd651fe30da2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:04:50 crc kubenswrapper[4685]: I0321 04:04:50.852638 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03969ee5-9f6c-4a50-8a9f-fd651fe30da2-config-data" (OuterVolumeSpecName: "config-data") pod "03969ee5-9f6c-4a50-8a9f-fd651fe30da2" (UID: "03969ee5-9f6c-4a50-8a9f-fd651fe30da2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:04:50 crc kubenswrapper[4685]: I0321 04:04:50.932023 4685 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03969ee5-9f6c-4a50-8a9f-fd651fe30da2-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:04:50 crc kubenswrapper[4685]: I0321 04:04:50.932051 4685 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03969ee5-9f6c-4a50-8a9f-fd651fe30da2-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:04:50 crc kubenswrapper[4685]: I0321 04:04:50.932060 4685 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03969ee5-9f6c-4a50-8a9f-fd651fe30da2-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 21 04:04:50 crc kubenswrapper[4685]: I0321 04:04:50.932068 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7ctb\" (UniqueName: \"kubernetes.io/projected/03969ee5-9f6c-4a50-8a9f-fd651fe30da2-kube-api-access-l7ctb\") on node \"crc\" DevicePath \"\"" Mar 21 04:04:50 crc kubenswrapper[4685]: I0321 04:04:50.932077 4685 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03969ee5-9f6c-4a50-8a9f-fd651fe30da2-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 21 04:04:51 crc kubenswrapper[4685]: I0321 04:04:51.441441 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-bootstrap-dqllc" event={"ID":"03969ee5-9f6c-4a50-8a9f-fd651fe30da2","Type":"ContainerDied","Data":"b2f429f1ca01df3a0334d9b6528a2ed86547b4ad62a86d99e391f4f91cd8c642"} Mar 21 04:04:51 crc kubenswrapper[4685]: I0321 04:04:51.441481 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2f429f1ca01df3a0334d9b6528a2ed86547b4ad62a86d99e391f4f91cd8c642" Mar 21 04:04:51 crc kubenswrapper[4685]: I0321 04:04:51.441525 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-bootstrap-dqllc" Mar 21 04:04:51 crc kubenswrapper[4685]: I0321 04:04:51.531755 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/keystone-8649f5b8f-mdp78"] Mar 21 04:04:51 crc kubenswrapper[4685]: E0321 04:04:51.532053 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03969ee5-9f6c-4a50-8a9f-fd651fe30da2" containerName="keystone-bootstrap" Mar 21 04:04:51 crc kubenswrapper[4685]: I0321 04:04:51.532069 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="03969ee5-9f6c-4a50-8a9f-fd651fe30da2" containerName="keystone-bootstrap" Mar 21 04:04:51 crc kubenswrapper[4685]: I0321 04:04:51.532187 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="03969ee5-9f6c-4a50-8a9f-fd651fe30da2" containerName="keystone-bootstrap" Mar 21 04:04:51 crc kubenswrapper[4685]: I0321 04:04:51.532589 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-8649f5b8f-mdp78" Mar 21 04:04:51 crc kubenswrapper[4685]: I0321 04:04:51.534208 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-keystone-dockercfg-mvlf5" Mar 21 04:04:51 crc kubenswrapper[4685]: I0321 04:04:51.534995 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-scripts" Mar 21 04:04:51 crc kubenswrapper[4685]: I0321 04:04:51.535020 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-config-data" Mar 21 04:04:51 crc kubenswrapper[4685]: I0321 04:04:51.535778 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone" Mar 21 04:04:51 crc kubenswrapper[4685]: I0321 04:04:51.540076 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b2b4768-f546-4fad-9609-8b01fa7749dc-scripts\") pod \"keystone-8649f5b8f-mdp78\" (UID: \"0b2b4768-f546-4fad-9609-8b01fa7749dc\") " pod="barbican-kuttl-tests/keystone-8649f5b8f-mdp78" Mar 21 04:04:51 crc kubenswrapper[4685]: I0321 04:04:51.540166 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b2b4768-f546-4fad-9609-8b01fa7749dc-config-data\") pod \"keystone-8649f5b8f-mdp78\" (UID: \"0b2b4768-f546-4fad-9609-8b01fa7749dc\") " pod="barbican-kuttl-tests/keystone-8649f5b8f-mdp78" Mar 21 04:04:51 crc kubenswrapper[4685]: I0321 04:04:51.540279 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wzk7\" (UniqueName: \"kubernetes.io/projected/0b2b4768-f546-4fad-9609-8b01fa7749dc-kube-api-access-7wzk7\") pod \"keystone-8649f5b8f-mdp78\" (UID: \"0b2b4768-f546-4fad-9609-8b01fa7749dc\") " pod="barbican-kuttl-tests/keystone-8649f5b8f-mdp78" Mar 21 04:04:51 crc kubenswrapper[4685]: I0321 04:04:51.540340 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0b2b4768-f546-4fad-9609-8b01fa7749dc-fernet-keys\") pod \"keystone-8649f5b8f-mdp78\" (UID: \"0b2b4768-f546-4fad-9609-8b01fa7749dc\") " pod="barbican-kuttl-tests/keystone-8649f5b8f-mdp78" Mar 21 04:04:51 crc kubenswrapper[4685]: I0321 04:04:51.540386 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" 
(UniqueName: \"kubernetes.io/secret/0b2b4768-f546-4fad-9609-8b01fa7749dc-credential-keys\") pod \"keystone-8649f5b8f-mdp78\" (UID: \"0b2b4768-f546-4fad-9609-8b01fa7749dc\") " pod="barbican-kuttl-tests/keystone-8649f5b8f-mdp78" Mar 21 04:04:51 crc kubenswrapper[4685]: I0321 04:04:51.542592 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-8649f5b8f-mdp78"] Mar 21 04:04:51 crc kubenswrapper[4685]: I0321 04:04:51.641189 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0b2b4768-f546-4fad-9609-8b01fa7749dc-fernet-keys\") pod \"keystone-8649f5b8f-mdp78\" (UID: \"0b2b4768-f546-4fad-9609-8b01fa7749dc\") " pod="barbican-kuttl-tests/keystone-8649f5b8f-mdp78" Mar 21 04:04:51 crc kubenswrapper[4685]: I0321 04:04:51.641231 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0b2b4768-f546-4fad-9609-8b01fa7749dc-credential-keys\") pod \"keystone-8649f5b8f-mdp78\" (UID: \"0b2b4768-f546-4fad-9609-8b01fa7749dc\") " pod="barbican-kuttl-tests/keystone-8649f5b8f-mdp78" Mar 21 04:04:51 crc kubenswrapper[4685]: I0321 04:04:51.641256 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b2b4768-f546-4fad-9609-8b01fa7749dc-scripts\") pod \"keystone-8649f5b8f-mdp78\" (UID: \"0b2b4768-f546-4fad-9609-8b01fa7749dc\") " pod="barbican-kuttl-tests/keystone-8649f5b8f-mdp78" Mar 21 04:04:51 crc kubenswrapper[4685]: I0321 04:04:51.641288 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b2b4768-f546-4fad-9609-8b01fa7749dc-config-data\") pod \"keystone-8649f5b8f-mdp78\" (UID: \"0b2b4768-f546-4fad-9609-8b01fa7749dc\") " pod="barbican-kuttl-tests/keystone-8649f5b8f-mdp78" Mar 21 04:04:51 crc kubenswrapper[4685]: I0321 04:04:51.641340 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wzk7\" (UniqueName: \"kubernetes.io/projected/0b2b4768-f546-4fad-9609-8b01fa7749dc-kube-api-access-7wzk7\") pod \"keystone-8649f5b8f-mdp78\" (UID: \"0b2b4768-f546-4fad-9609-8b01fa7749dc\") " pod="barbican-kuttl-tests/keystone-8649f5b8f-mdp78" Mar 21 04:04:51 crc kubenswrapper[4685]: I0321 04:04:51.646727 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0b2b4768-f546-4fad-9609-8b01fa7749dc-fernet-keys\") pod \"keystone-8649f5b8f-mdp78\" (UID: \"0b2b4768-f546-4fad-9609-8b01fa7749dc\") " pod="barbican-kuttl-tests/keystone-8649f5b8f-mdp78" Mar 21 04:04:51 crc kubenswrapper[4685]: I0321 04:04:51.650086 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b2b4768-f546-4fad-9609-8b01fa7749dc-scripts\") pod \"keystone-8649f5b8f-mdp78\" (UID: \"0b2b4768-f546-4fad-9609-8b01fa7749dc\") " pod="barbican-kuttl-tests/keystone-8649f5b8f-mdp78" Mar 21 04:04:51 crc kubenswrapper[4685]: I0321 04:04:51.650478 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0b2b4768-f546-4fad-9609-8b01fa7749dc-credential-keys\") pod \"keystone-8649f5b8f-mdp78\" (UID: \"0b2b4768-f546-4fad-9609-8b01fa7749dc\") " pod="barbican-kuttl-tests/keystone-8649f5b8f-mdp78" Mar 21 04:04:51 crc kubenswrapper[4685]: I0321 04:04:51.662596 4685 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b2b4768-f546-4fad-9609-8b01fa7749dc-config-data\") pod \"keystone-8649f5b8f-mdp78\" (UID: \"0b2b4768-f546-4fad-9609-8b01fa7749dc\") " pod="barbican-kuttl-tests/keystone-8649f5b8f-mdp78" Mar 21 04:04:51 crc kubenswrapper[4685]: I0321 04:04:51.671620 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wzk7\" (UniqueName: \"kubernetes.io/projected/0b2b4768-f546-4fad-9609-8b01fa7749dc-kube-api-access-7wzk7\") pod \"keystone-8649f5b8f-mdp78\" (UID: \"0b2b4768-f546-4fad-9609-8b01fa7749dc\") " pod="barbican-kuttl-tests/keystone-8649f5b8f-mdp78" Mar 21 04:04:51 crc kubenswrapper[4685]: I0321 04:04:51.847445 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-8649f5b8f-mdp78" Mar 21 04:04:52 crc kubenswrapper[4685]: I0321 04:04:52.260404 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-8649f5b8f-mdp78"] Mar 21 04:04:52 crc kubenswrapper[4685]: I0321 04:04:52.451658 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-8649f5b8f-mdp78" event={"ID":"0b2b4768-f546-4fad-9609-8b01fa7749dc","Type":"ContainerStarted","Data":"22d93438459a6e12b158d3b08c3f4ece277363dc0de50210f3185acb8116f22d"} Mar 21 04:04:52 crc kubenswrapper[4685]: I0321 04:04:52.451721 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-8649f5b8f-mdp78" event={"ID":"0b2b4768-f546-4fad-9609-8b01fa7749dc","Type":"ContainerStarted","Data":"195a642aaffcc74f1c4f45bc57d66bac5c1dbd31403b1147e31b3cbf02f60cbd"} Mar 21 04:04:52 crc kubenswrapper[4685]: I0321 04:04:52.452813 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/keystone-8649f5b8f-mdp78" Mar 21 04:04:52 crc kubenswrapper[4685]: I0321 04:04:52.472881 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/keystone-8649f5b8f-mdp78" podStartSLOduration=1.472866676 podStartE2EDuration="1.472866676s" podCreationTimestamp="2026-03-21 04:04:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:04:52.46873094 +0000 UTC m=+1124.945799732" watchObservedRunningTime="2026-03-21 04:04:52.472866676 +0000 UTC m=+1124.949935468" Mar 21 04:04:55 crc kubenswrapper[4685]: I0321 04:04:55.896239 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/dee043e15d25f12961d0325068fc60bde20860efbe3e725c5a2edfb26ch99b5"] Mar 21 04:04:55 crc kubenswrapper[4685]: I0321 04:04:55.898400 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/dee043e15d25f12961d0325068fc60bde20860efbe3e725c5a2edfb26ch99b5" Mar 21 04:04:55 crc kubenswrapper[4685]: I0321 04:04:55.901115 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-2vwsz" Mar 21 04:04:55 crc kubenswrapper[4685]: I0321 04:04:55.915083 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/dee043e15d25f12961d0325068fc60bde20860efbe3e725c5a2edfb26ch99b5"] Mar 21 04:04:56 crc kubenswrapper[4685]: I0321 04:04:56.099461 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdssx\" (UniqueName: \"kubernetes.io/projected/a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2-kube-api-access-jdssx\") pod \"dee043e15d25f12961d0325068fc60bde20860efbe3e725c5a2edfb26ch99b5\" (UID: \"a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2\") " pod="openstack-operators/dee043e15d25f12961d0325068fc60bde20860efbe3e725c5a2edfb26ch99b5" Mar 21 04:04:56 crc kubenswrapper[4685]: I0321 04:04:56.099596 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2-bundle\") pod \"dee043e15d25f12961d0325068fc60bde20860efbe3e725c5a2edfb26ch99b5\" (UID: \"a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2\") " pod="openstack-operators/dee043e15d25f12961d0325068fc60bde20860efbe3e725c5a2edfb26ch99b5" Mar 21 04:04:56 crc kubenswrapper[4685]: I0321 04:04:56.099715 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2-util\") pod \"dee043e15d25f12961d0325068fc60bde20860efbe3e725c5a2edfb26ch99b5\" (UID: \"a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2\") " pod="openstack-operators/dee043e15d25f12961d0325068fc60bde20860efbe3e725c5a2edfb26ch99b5" Mar 21 04:04:56 crc kubenswrapper[4685]: I0321 04:04:56.201418 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdssx\" (UniqueName: \"kubernetes.io/projected/a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2-kube-api-access-jdssx\") pod \"dee043e15d25f12961d0325068fc60bde20860efbe3e725c5a2edfb26ch99b5\" (UID: \"a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2\") " pod="openstack-operators/dee043e15d25f12961d0325068fc60bde20860efbe3e725c5a2edfb26ch99b5" Mar 21 04:04:56 crc kubenswrapper[4685]: I0321 04:04:56.202004 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2-bundle\") pod \"dee043e15d25f12961d0325068fc60bde20860efbe3e725c5a2edfb26ch99b5\" (UID: \"a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2\") " pod="openstack-operators/dee043e15d25f12961d0325068fc60bde20860efbe3e725c5a2edfb26ch99b5" Mar 21 04:04:56 crc kubenswrapper[4685]: I0321 04:04:56.202131 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2-util\") pod \"dee043e15d25f12961d0325068fc60bde20860efbe3e725c5a2edfb26ch99b5\" (UID: \"a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2\") " pod="openstack-operators/dee043e15d25f12961d0325068fc60bde20860efbe3e725c5a2edfb26ch99b5" Mar 21 04:04:56 crc kubenswrapper[4685]: I0321 04:04:56.202677 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2-bundle\") pod \"dee043e15d25f12961d0325068fc60bde20860efbe3e725c5a2edfb26ch99b5\" (UID: \"a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2\") " pod="openstack-operators/dee043e15d25f12961d0325068fc60bde20860efbe3e725c5a2edfb26ch99b5" Mar 21 04:04:56 crc kubenswrapper[4685]: I0321 04:04:56.202978 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2-util\") pod \"dee043e15d25f12961d0325068fc60bde20860efbe3e725c5a2edfb26ch99b5\" (UID: \"a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2\") " pod="openstack-operators/dee043e15d25f12961d0325068fc60bde20860efbe3e725c5a2edfb26ch99b5" Mar 21 04:04:56 crc kubenswrapper[4685]: I0321 04:04:56.253345 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdssx\" (UniqueName: \"kubernetes.io/projected/a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2-kube-api-access-jdssx\") pod \"dee043e15d25f12961d0325068fc60bde20860efbe3e725c5a2edfb26ch99b5\" (UID: \"a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2\") " pod="openstack-operators/dee043e15d25f12961d0325068fc60bde20860efbe3e725c5a2edfb26ch99b5" Mar 21 04:04:56 crc kubenswrapper[4685]: I0321 04:04:56.523340 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/dee043e15d25f12961d0325068fc60bde20860efbe3e725c5a2edfb26ch99b5" Mar 21 04:04:57 crc kubenswrapper[4685]: I0321 04:04:57.141394 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/dee043e15d25f12961d0325068fc60bde20860efbe3e725c5a2edfb26ch99b5"] Mar 21 04:04:57 crc kubenswrapper[4685]: W0321 04:04:57.147073 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7d82eb2_9e3c_4a49_b3d5_3e66eb5f72d2.slice/crio-1e6e2e046e63159452b85ed3fc53a47c8fa5b5c97f84369a660e7f7aa51d5710 WatchSource:0}: Error finding container 1e6e2e046e63159452b85ed3fc53a47c8fa5b5c97f84369a660e7f7aa51d5710: Status 404 returned error can't find the container with id 1e6e2e046e63159452b85ed3fc53a47c8fa5b5c97f84369a660e7f7aa51d5710 Mar 21 04:04:57 crc kubenswrapper[4685]: I0321 04:04:57.501435 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dee043e15d25f12961d0325068fc60bde20860efbe3e725c5a2edfb26ch99b5" event={"ID":"a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2","Type":"ContainerStarted","Data":"d60f5c559582dc45d7a37558330eb8efd4ecbb1054042c3729d4b402b10a4150"} Mar 21 04:04:57 crc kubenswrapper[4685]: I0321 04:04:57.501937 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dee043e15d25f12961d0325068fc60bde20860efbe3e725c5a2edfb26ch99b5" event={"ID":"a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2","Type":"ContainerStarted","Data":"1e6e2e046e63159452b85ed3fc53a47c8fa5b5c97f84369a660e7f7aa51d5710"} Mar 21 04:04:58 crc kubenswrapper[4685]: I0321 04:04:58.509433 4685 generic.go:334] "Generic (PLEG): container finished" podID="a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2" containerID="d60f5c559582dc45d7a37558330eb8efd4ecbb1054042c3729d4b402b10a4150" exitCode=0 Mar 21 04:04:58 crc kubenswrapper[4685]: I0321 04:04:58.510074 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dee043e15d25f12961d0325068fc60bde20860efbe3e725c5a2edfb26ch99b5" event={"ID":"a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2","Type":"ContainerDied","Data":"d60f5c559582dc45d7a37558330eb8efd4ecbb1054042c3729d4b402b10a4150"} Mar 21 04:04:59 crc kubenswrapper[4685]: 
I0321 04:04:59.520445 4685 generic.go:334] "Generic (PLEG): container finished" podID="a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2" containerID="d635250f0ee1eea21d79b1e0ca7e4f6e4057440a9515737b2f0614c19a9ce1fc" exitCode=0 Mar 21 04:04:59 crc kubenswrapper[4685]: I0321 04:04:59.520507 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dee043e15d25f12961d0325068fc60bde20860efbe3e725c5a2edfb26ch99b5" event={"ID":"a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2","Type":"ContainerDied","Data":"d635250f0ee1eea21d79b1e0ca7e4f6e4057440a9515737b2f0614c19a9ce1fc"} Mar 21 04:05:00 crc kubenswrapper[4685]: I0321 04:05:00.530421 4685 generic.go:334] "Generic (PLEG): container finished" podID="a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2" containerID="504e1e37d72164f3df29405c996af83bddc68957778611fcb847936ee9327586" exitCode=0 Mar 21 04:05:00 crc kubenswrapper[4685]: I0321 04:05:00.530471 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dee043e15d25f12961d0325068fc60bde20860efbe3e725c5a2edfb26ch99b5" event={"ID":"a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2","Type":"ContainerDied","Data":"504e1e37d72164f3df29405c996af83bddc68957778611fcb847936ee9327586"} Mar 21 04:05:01 crc kubenswrapper[4685]: I0321 04:05:01.869436 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/dee043e15d25f12961d0325068fc60bde20860efbe3e725c5a2edfb26ch99b5" Mar 21 04:05:02 crc kubenswrapper[4685]: I0321 04:05:02.010734 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdssx\" (UniqueName: \"kubernetes.io/projected/a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2-kube-api-access-jdssx\") pod \"a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2\" (UID: \"a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2\") " Mar 21 04:05:02 crc kubenswrapper[4685]: I0321 04:05:02.010877 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2-util\") pod \"a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2\" (UID: \"a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2\") " Mar 21 04:05:02 crc kubenswrapper[4685]: I0321 04:05:02.010906 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2-bundle\") pod \"a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2\" (UID: \"a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2\") " Mar 21 04:05:02 crc kubenswrapper[4685]: I0321 04:05:02.012679 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2-bundle" (OuterVolumeSpecName: "bundle") pod "a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2" (UID: "a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:05:02 crc kubenswrapper[4685]: I0321 04:05:02.020598 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2-kube-api-access-jdssx" (OuterVolumeSpecName: "kube-api-access-jdssx") pod "a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2" (UID: "a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2"). InnerVolumeSpecName "kube-api-access-jdssx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:05:02 crc kubenswrapper[4685]: I0321 04:05:02.040964 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2-util" (OuterVolumeSpecName: "util") pod "a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2" (UID: "a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:05:02 crc kubenswrapper[4685]: I0321 04:05:02.112653 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdssx\" (UniqueName: \"kubernetes.io/projected/a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2-kube-api-access-jdssx\") on node \"crc\" DevicePath \"\"" Mar 21 04:05:02 crc kubenswrapper[4685]: I0321 04:05:02.113056 4685 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2-util\") on node \"crc\" DevicePath \"\"" Mar 21 04:05:02 crc kubenswrapper[4685]: I0321 04:05:02.113141 4685 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:05:02 crc kubenswrapper[4685]: I0321 04:05:02.551271 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dee043e15d25f12961d0325068fc60bde20860efbe3e725c5a2edfb26ch99b5" event={"ID":"a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2","Type":"ContainerDied","Data":"1e6e2e046e63159452b85ed3fc53a47c8fa5b5c97f84369a660e7f7aa51d5710"} Mar 21 04:05:02 crc kubenswrapper[4685]: I0321 04:05:02.551339 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e6e2e046e63159452b85ed3fc53a47c8fa5b5c97f84369a660e7f7aa51d5710" Mar 21 04:05:02 crc kubenswrapper[4685]: I0321 04:05:02.551370 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/dee043e15d25f12961d0325068fc60bde20860efbe3e725c5a2edfb26ch99b5" Mar 21 04:05:09 crc kubenswrapper[4685]: I0321 04:05:09.685092 4685 patch_prober.go:28] interesting pod/machine-config-daemon-7r9cg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:05:09 crc kubenswrapper[4685]: I0321 04:05:09.685602 4685 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:05:23 crc kubenswrapper[4685]: I0321 04:05:23.358489 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="barbican-kuttl-tests/keystone-8649f5b8f-mdp78" Mar 21 04:05:27 crc kubenswrapper[4685]: I0321 04:05:27.318722 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5589cf8c54-6qwrt"] Mar 21 04:05:27 crc kubenswrapper[4685]: E0321 04:05:27.321668 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2" containerName="pull" Mar 21 04:05:27 crc kubenswrapper[4685]: I0321 04:05:27.321767 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2" containerName="pull" Mar 21 04:05:27 crc kubenswrapper[4685]: E0321 04:05:27.321888 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2" containerName="extract" Mar 21 04:05:27 crc kubenswrapper[4685]: I0321 04:05:27.321972 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2" containerName="extract" Mar 21 04:05:27 crc kubenswrapper[4685]: E0321 04:05:27.322050 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2" containerName="util" Mar 21 04:05:27 crc kubenswrapper[4685]: I0321 04:05:27.322126 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2" containerName="util" Mar 21 04:05:27 crc kubenswrapper[4685]: I0321 04:05:27.322351 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2" containerName="extract" Mar 21 04:05:27 crc kubenswrapper[4685]: I0321 04:05:27.323247 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5589cf8c54-6qwrt" Mar 21 04:05:27 crc kubenswrapper[4685]: I0321 04:05:27.325593 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-service-cert" Mar 21 04:05:27 crc kubenswrapper[4685]: I0321 04:05:27.326027 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-xncx2" Mar 21 04:05:27 crc kubenswrapper[4685]: I0321 04:05:27.350921 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5589cf8c54-6qwrt"] Mar 21 04:05:27 crc kubenswrapper[4685]: I0321 04:05:27.410660 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/50509c19-c2fa-4171-a5f8-e4d699a9062c-webhook-cert\") pod \"barbican-operator-controller-manager-5589cf8c54-6qwrt\" (UID: \"50509c19-c2fa-4171-a5f8-e4d699a9062c\") " pod="openstack-operators/barbican-operator-controller-manager-5589cf8c54-6qwrt" Mar 21 04:05:27 crc kubenswrapper[4685]: I0321 04:05:27.410746 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/50509c19-c2fa-4171-a5f8-e4d699a9062c-apiservice-cert\") pod \"barbican-operator-controller-manager-5589cf8c54-6qwrt\" (UID: \"50509c19-c2fa-4171-a5f8-e4d699a9062c\") " pod="openstack-operators/barbican-operator-controller-manager-5589cf8c54-6qwrt" Mar 21 04:05:27 crc kubenswrapper[4685]: I0321 04:05:27.411065 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7c84\" (UniqueName: \"kubernetes.io/projected/50509c19-c2fa-4171-a5f8-e4d699a9062c-kube-api-access-l7c84\") pod \"barbican-operator-controller-manager-5589cf8c54-6qwrt\" (UID: \"50509c19-c2fa-4171-a5f8-e4d699a9062c\") " pod="openstack-operators/barbican-operator-controller-manager-5589cf8c54-6qwrt" Mar 21 04:05:27 crc kubenswrapper[4685]: I0321 04:05:27.512308 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/50509c19-c2fa-4171-a5f8-e4d699a9062c-apiservice-cert\") pod \"barbican-operator-controller-manager-5589cf8c54-6qwrt\" (UID: \"50509c19-c2fa-4171-a5f8-e4d699a9062c\") " pod="openstack-operators/barbican-operator-controller-manager-5589cf8c54-6qwrt" Mar 21 04:05:27 crc kubenswrapper[4685]: I0321 04:05:27.512422 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7c84\" (UniqueName: \"kubernetes.io/projected/50509c19-c2fa-4171-a5f8-e4d699a9062c-kube-api-access-l7c84\") pod \"barbican-operator-controller-manager-5589cf8c54-6qwrt\" (UID: \"50509c19-c2fa-4171-a5f8-e4d699a9062c\") " pod="openstack-operators/barbican-operator-controller-manager-5589cf8c54-6qwrt" Mar 21 04:05:27 crc kubenswrapper[4685]: I0321 04:05:27.512472 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/50509c19-c2fa-4171-a5f8-e4d699a9062c-webhook-cert\") pod \"barbican-operator-controller-manager-5589cf8c54-6qwrt\" (UID: \"50509c19-c2fa-4171-a5f8-e4d699a9062c\") " pod="openstack-operators/barbican-operator-controller-manager-5589cf8c54-6qwrt" Mar 21 04:05:27 crc kubenswrapper[4685]: I0321 04:05:27.517519 4685 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/50509c19-c2fa-4171-a5f8-e4d699a9062c-apiservice-cert\") pod \"barbican-operator-controller-manager-5589cf8c54-6qwrt\" (UID: \"50509c19-c2fa-4171-a5f8-e4d699a9062c\") " pod="openstack-operators/barbican-operator-controller-manager-5589cf8c54-6qwrt" Mar 21 04:05:27 crc kubenswrapper[4685]: I0321 04:05:27.517622 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/50509c19-c2fa-4171-a5f8-e4d699a9062c-webhook-cert\") pod \"barbican-operator-controller-manager-5589cf8c54-6qwrt\" (UID: \"50509c19-c2fa-4171-a5f8-e4d699a9062c\") " pod="openstack-operators/barbican-operator-controller-manager-5589cf8c54-6qwrt" Mar 21 04:05:27 crc kubenswrapper[4685]: I0321 04:05:27.529869 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7c84\" (UniqueName: \"kubernetes.io/projected/50509c19-c2fa-4171-a5f8-e4d699a9062c-kube-api-access-l7c84\") pod \"barbican-operator-controller-manager-5589cf8c54-6qwrt\" (UID: \"50509c19-c2fa-4171-a5f8-e4d699a9062c\") " pod="openstack-operators/barbican-operator-controller-manager-5589cf8c54-6qwrt" Mar 21 04:05:27 crc kubenswrapper[4685]: I0321 04:05:27.645324 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5589cf8c54-6qwrt" Mar 21 04:05:28 crc kubenswrapper[4685]: I0321 04:05:28.047374 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5589cf8c54-6qwrt"] Mar 21 04:05:28 crc kubenswrapper[4685]: I0321 04:05:28.745218 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5589cf8c54-6qwrt" event={"ID":"50509c19-c2fa-4171-a5f8-e4d699a9062c","Type":"ContainerStarted","Data":"fb8084639e3d5227e099fa993d8ddeee89b3980388b42ba56904a8ce72cbc9a0"} Mar 21 04:05:30 crc kubenswrapper[4685]: I0321 04:05:30.764569 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5589cf8c54-6qwrt" event={"ID":"50509c19-c2fa-4171-a5f8-e4d699a9062c","Type":"ContainerStarted","Data":"cbbd1a2ce1003d134cae1726505335a0eaf118ab38ccb1008f2e26952fb34710"} Mar 21 04:05:30 crc kubenswrapper[4685]: I0321 04:05:30.764901 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-5589cf8c54-6qwrt" Mar 21 04:05:30 crc kubenswrapper[4685]: I0321 04:05:30.793946 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-5589cf8c54-6qwrt" podStartSLOduration=2.215184801 podStartE2EDuration="3.793915171s" podCreationTimestamp="2026-03-21 04:05:27 +0000 UTC" firstStartedPulling="2026-03-21 04:05:28.054992693 +0000 UTC m=+1160.532061485" lastFinishedPulling="2026-03-21 04:05:29.633723063 +0000 UTC m=+1162.110791855" observedRunningTime="2026-03-21 04:05:30.78923622 +0000 UTC m=+1163.266305102" watchObservedRunningTime="2026-03-21 04:05:30.793915171 +0000 UTC m=+1163.270984023" Mar 21 04:05:37 crc kubenswrapper[4685]: I0321 04:05:37.650522 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-5589cf8c54-6qwrt" Mar 21 04:05:39 crc kubenswrapper[4685]: I0321 04:05:39.165866 4685 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-db-create-f7bbt"] Mar 21 04:05:39 crc kubenswrapper[4685]: I0321 04:05:39.167286 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-f7bbt" Mar 21 04:05:39 crc kubenswrapper[4685]: I0321 04:05:39.170615 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-f7bbt"] Mar 21 04:05:39 crc kubenswrapper[4685]: I0321 04:05:39.268483 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-1eeb-account-create-update-p8r2p"] Mar 21 04:05:39 crc kubenswrapper[4685]: I0321 04:05:39.269236 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-1eeb-account-create-update-p8r2p" Mar 21 04:05:39 crc kubenswrapper[4685]: I0321 04:05:39.271759 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-db-secret" Mar 21 04:05:39 crc kubenswrapper[4685]: I0321 04:05:39.274066 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c914374a-7747-4f27-aeee-b8a64331159a-operator-scripts\") pod \"barbican-db-create-f7bbt\" (UID: \"c914374a-7747-4f27-aeee-b8a64331159a\") " pod="barbican-kuttl-tests/barbican-db-create-f7bbt" Mar 21 04:05:39 crc kubenswrapper[4685]: I0321 04:05:39.274154 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8zm5\" (UniqueName: \"kubernetes.io/projected/c914374a-7747-4f27-aeee-b8a64331159a-kube-api-access-h8zm5\") pod \"barbican-db-create-f7bbt\" (UID: \"c914374a-7747-4f27-aeee-b8a64331159a\") " pod="barbican-kuttl-tests/barbican-db-create-f7bbt" Mar 21 04:05:39 crc kubenswrapper[4685]: I0321 04:05:39.290075 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-1eeb-account-create-update-p8r2p"] Mar 21 04:05:39 crc kubenswrapper[4685]: I0321 04:05:39.375437 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwm6k\" (UniqueName: \"kubernetes.io/projected/70c7121a-c5b2-4cf2-83bf-208ca926520a-kube-api-access-mwm6k\") pod \"barbican-1eeb-account-create-update-p8r2p\" (UID: \"70c7121a-c5b2-4cf2-83bf-208ca926520a\") " pod="barbican-kuttl-tests/barbican-1eeb-account-create-update-p8r2p" Mar 21 04:05:39 crc kubenswrapper[4685]: I0321 04:05:39.375715 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8zm5\" (UniqueName: \"kubernetes.io/projected/c914374a-7747-4f27-aeee-b8a64331159a-kube-api-access-h8zm5\") pod \"barbican-db-create-f7bbt\" (UID: \"c914374a-7747-4f27-aeee-b8a64331159a\") " pod="barbican-kuttl-tests/barbican-db-create-f7bbt" Mar 21 04:05:39 crc kubenswrapper[4685]: I0321 04:05:39.375874 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c914374a-7747-4f27-aeee-b8a64331159a-operator-scripts\") pod \"barbican-db-create-f7bbt\" (UID: \"c914374a-7747-4f27-aeee-b8a64331159a\") " pod="barbican-kuttl-tests/barbican-db-create-f7bbt" Mar 21 04:05:39 crc kubenswrapper[4685]: I0321 04:05:39.375911 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/70c7121a-c5b2-4cf2-83bf-208ca926520a-operator-scripts\") pod \"barbican-1eeb-account-create-update-p8r2p\" (UID: \"70c7121a-c5b2-4cf2-83bf-208ca926520a\") " pod="barbican-kuttl-tests/barbican-1eeb-account-create-update-p8r2p" Mar 21 04:05:39 crc kubenswrapper[4685]: I0321 04:05:39.376584 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c914374a-7747-4f27-aeee-b8a64331159a-operator-scripts\") pod \"barbican-db-create-f7bbt\" (UID: \"c914374a-7747-4f27-aeee-b8a64331159a\") " pod="barbican-kuttl-tests/barbican-db-create-f7bbt" Mar 21 04:05:39 crc kubenswrapper[4685]: I0321 04:05:39.395519 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8zm5\" (UniqueName: \"kubernetes.io/projected/c914374a-7747-4f27-aeee-b8a64331159a-kube-api-access-h8zm5\") pod \"barbican-db-create-f7bbt\" (UID: \"c914374a-7747-4f27-aeee-b8a64331159a\") " pod="barbican-kuttl-tests/barbican-db-create-f7bbt" Mar 21 04:05:39 crc kubenswrapper[4685]: I0321 04:05:39.477070 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70c7121a-c5b2-4cf2-83bf-208ca926520a-operator-scripts\") pod \"barbican-1eeb-account-create-update-p8r2p\" (UID: \"70c7121a-c5b2-4cf2-83bf-208ca926520a\") " pod="barbican-kuttl-tests/barbican-1eeb-account-create-update-p8r2p" Mar 21 04:05:39 crc kubenswrapper[4685]: I0321 04:05:39.477142 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwm6k\" (UniqueName: \"kubernetes.io/projected/70c7121a-c5b2-4cf2-83bf-208ca926520a-kube-api-access-mwm6k\") pod \"barbican-1eeb-account-create-update-p8r2p\" (UID: \"70c7121a-c5b2-4cf2-83bf-208ca926520a\") " pod="barbican-kuttl-tests/barbican-1eeb-account-create-update-p8r2p" Mar 21 04:05:39 crc kubenswrapper[4685]: I0321 04:05:39.478029 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70c7121a-c5b2-4cf2-83bf-208ca926520a-operator-scripts\") pod \"barbican-1eeb-account-create-update-p8r2p\" (UID: \"70c7121a-c5b2-4cf2-83bf-208ca926520a\") " pod="barbican-kuttl-tests/barbican-1eeb-account-create-update-p8r2p" Mar 21 04:05:39 crc kubenswrapper[4685]: I0321 04:05:39.483533 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-f7bbt" Mar 21 04:05:39 crc kubenswrapper[4685]: I0321 04:05:39.495954 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwm6k\" (UniqueName: \"kubernetes.io/projected/70c7121a-c5b2-4cf2-83bf-208ca926520a-kube-api-access-mwm6k\") pod \"barbican-1eeb-account-create-update-p8r2p\" (UID: \"70c7121a-c5b2-4cf2-83bf-208ca926520a\") " pod="barbican-kuttl-tests/barbican-1eeb-account-create-update-p8r2p" Mar 21 04:05:39 crc kubenswrapper[4685]: I0321 04:05:39.585622 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-1eeb-account-create-update-p8r2p" Mar 21 04:05:39 crc kubenswrapper[4685]: I0321 04:05:39.685577 4685 patch_prober.go:28] interesting pod/machine-config-daemon-7r9cg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:05:39 crc kubenswrapper[4685]: I0321 04:05:39.685644 4685 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:05:39 crc kubenswrapper[4685]: I0321 04:05:39.755640 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-f7bbt"] Mar 21 04:05:39 crc kubenswrapper[4685]: I0321 04:05:39.838827 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-create-f7bbt" event={"ID":"c914374a-7747-4f27-aeee-b8a64331159a","Type":"ContainerStarted","Data":"8da0c90c68d00b3591be202d706b79035ad2261b33b1f1dda4141988ad418b4d"} Mar 21 04:05:40 crc kubenswrapper[4685]: I0321 04:05:40.170314 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-1eeb-account-create-update-p8r2p"] Mar 21 04:05:40 crc kubenswrapper[4685]: I0321 04:05:40.848044 4685 generic.go:334] "Generic (PLEG): container finished" podID="70c7121a-c5b2-4cf2-83bf-208ca926520a" containerID="8101f771393f4477ed3b0386624ce42ba8be023cfc7d057e7b79e3f15820014f" exitCode=0 Mar 21 04:05:40 crc kubenswrapper[4685]: I0321 04:05:40.848326 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-1eeb-account-create-update-p8r2p" event={"ID":"70c7121a-c5b2-4cf2-83bf-208ca926520a","Type":"ContainerDied","Data":"8101f771393f4477ed3b0386624ce42ba8be023cfc7d057e7b79e3f15820014f"} Mar 21 04:05:40 crc kubenswrapper[4685]: I0321 04:05:40.848352 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-1eeb-account-create-update-p8r2p" event={"ID":"70c7121a-c5b2-4cf2-83bf-208ca926520a","Type":"ContainerStarted","Data":"a292e721aff610c8706c888a253075a3a5d03dffc1daebec5b065c569cf7633f"} Mar 21 04:05:40 crc kubenswrapper[4685]: I0321 04:05:40.850157 4685 generic.go:334] "Generic (PLEG): container finished" podID="c914374a-7747-4f27-aeee-b8a64331159a" containerID="a119c9da6267126198e8b7fd6119c4f5f090b32d83a339841698fe6376d4c2ac" exitCode=0 Mar 21 04:05:40 crc kubenswrapper[4685]: I0321 04:05:40.850194 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-create-f7bbt" event={"ID":"c914374a-7747-4f27-aeee-b8a64331159a","Type":"ContainerDied","Data":"a119c9da6267126198e8b7fd6119c4f5f090b32d83a339841698fe6376d4c2ac"} Mar 21 04:05:42 crc kubenswrapper[4685]: I0321 04:05:42.180271 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-f7bbt" Mar 21 04:05:42 crc kubenswrapper[4685]: I0321 04:05:42.183979 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-1eeb-account-create-update-p8r2p" Mar 21 04:05:42 crc kubenswrapper[4685]: I0321 04:05:42.325220 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c914374a-7747-4f27-aeee-b8a64331159a-operator-scripts\") pod \"c914374a-7747-4f27-aeee-b8a64331159a\" (UID: \"c914374a-7747-4f27-aeee-b8a64331159a\") " Mar 21 04:05:42 crc kubenswrapper[4685]: I0321 04:05:42.325275 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwm6k\" (UniqueName: \"kubernetes.io/projected/70c7121a-c5b2-4cf2-83bf-208ca926520a-kube-api-access-mwm6k\") pod \"70c7121a-c5b2-4cf2-83bf-208ca926520a\" (UID: \"70c7121a-c5b2-4cf2-83bf-208ca926520a\") " Mar 21 04:05:42 crc kubenswrapper[4685]: I0321 04:05:42.325400 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70c7121a-c5b2-4cf2-83bf-208ca926520a-operator-scripts\") pod \"70c7121a-c5b2-4cf2-83bf-208ca926520a\" (UID: \"70c7121a-c5b2-4cf2-83bf-208ca926520a\") " Mar 21 04:05:42 crc kubenswrapper[4685]: I0321 04:05:42.325506 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8zm5\" (UniqueName: \"kubernetes.io/projected/c914374a-7747-4f27-aeee-b8a64331159a-kube-api-access-h8zm5\") pod \"c914374a-7747-4f27-aeee-b8a64331159a\" (UID: \"c914374a-7747-4f27-aeee-b8a64331159a\") " Mar 21 04:05:42 crc kubenswrapper[4685]: I0321 04:05:42.326127 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c914374a-7747-4f27-aeee-b8a64331159a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c914374a-7747-4f27-aeee-b8a64331159a" (UID: "c914374a-7747-4f27-aeee-b8a64331159a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:05:42 crc kubenswrapper[4685]: I0321 04:05:42.326129 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70c7121a-c5b2-4cf2-83bf-208ca926520a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "70c7121a-c5b2-4cf2-83bf-208ca926520a" (UID: "70c7121a-c5b2-4cf2-83bf-208ca926520a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:05:42 crc kubenswrapper[4685]: I0321 04:05:42.331511 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70c7121a-c5b2-4cf2-83bf-208ca926520a-kube-api-access-mwm6k" (OuterVolumeSpecName: "kube-api-access-mwm6k") pod "70c7121a-c5b2-4cf2-83bf-208ca926520a" (UID: "70c7121a-c5b2-4cf2-83bf-208ca926520a"). InnerVolumeSpecName "kube-api-access-mwm6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:05:42 crc kubenswrapper[4685]: I0321 04:05:42.345064 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c914374a-7747-4f27-aeee-b8a64331159a-kube-api-access-h8zm5" (OuterVolumeSpecName: "kube-api-access-h8zm5") pod "c914374a-7747-4f27-aeee-b8a64331159a" (UID: "c914374a-7747-4f27-aeee-b8a64331159a"). InnerVolumeSpecName "kube-api-access-h8zm5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:05:42 crc kubenswrapper[4685]: I0321 04:05:42.427071 4685 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70c7121a-c5b2-4cf2-83bf-208ca926520a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:05:42 crc kubenswrapper[4685]: I0321 04:05:42.427125 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8zm5\" (UniqueName: \"kubernetes.io/projected/c914374a-7747-4f27-aeee-b8a64331159a-kube-api-access-h8zm5\") on node \"crc\" DevicePath \"\"" Mar 21 04:05:42 crc kubenswrapper[4685]: I0321 04:05:42.427142 4685 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c914374a-7747-4f27-aeee-b8a64331159a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:05:42 crc kubenswrapper[4685]: I0321 04:05:42.427248 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwm6k\" (UniqueName: \"kubernetes.io/projected/70c7121a-c5b2-4cf2-83bf-208ca926520a-kube-api-access-mwm6k\") on node \"crc\" DevicePath \"\"" Mar 21 04:05:42 crc kubenswrapper[4685]: I0321 04:05:42.870023 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-create-f7bbt" event={"ID":"c914374a-7747-4f27-aeee-b8a64331159a","Type":"ContainerDied","Data":"8da0c90c68d00b3591be202d706b79035ad2261b33b1f1dda4141988ad418b4d"} Mar 21 04:05:42 crc kubenswrapper[4685]: I0321 04:05:42.870065 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-f7bbt" Mar 21 04:05:42 crc kubenswrapper[4685]: I0321 04:05:42.870077 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8da0c90c68d00b3591be202d706b79035ad2261b33b1f1dda4141988ad418b4d" Mar 21 04:05:42 crc kubenswrapper[4685]: I0321 04:05:42.871975 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-1eeb-account-create-update-p8r2p" event={"ID":"70c7121a-c5b2-4cf2-83bf-208ca926520a","Type":"ContainerDied","Data":"a292e721aff610c8706c888a253075a3a5d03dffc1daebec5b065c569cf7633f"} Mar 21 04:05:42 crc kubenswrapper[4685]: I0321 04:05:42.872005 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a292e721aff610c8706c888a253075a3a5d03dffc1daebec5b065c569cf7633f" Mar 21 04:05:42 crc kubenswrapper[4685]: I0321 04:05:42.872146 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-1eeb-account-create-update-p8r2p" Mar 21 04:05:44 crc kubenswrapper[4685]: I0321 04:05:44.636980 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-jgnjz"] Mar 21 04:05:44 crc kubenswrapper[4685]: E0321 04:05:44.637805 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c914374a-7747-4f27-aeee-b8a64331159a" containerName="mariadb-database-create" Mar 21 04:05:44 crc kubenswrapper[4685]: I0321 04:05:44.637822 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="c914374a-7747-4f27-aeee-b8a64331159a" containerName="mariadb-database-create" Mar 21 04:05:44 crc kubenswrapper[4685]: E0321 04:05:44.637870 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70c7121a-c5b2-4cf2-83bf-208ca926520a" containerName="mariadb-account-create-update" Mar 21 04:05:44 crc kubenswrapper[4685]: I0321 04:05:44.637879 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="70c7121a-c5b2-4cf2-83bf-208ca926520a" containerName="mariadb-account-create-update" Mar 21 04:05:44 crc kubenswrapper[4685]: I0321 04:05:44.638019 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="70c7121a-c5b2-4cf2-83bf-208ca926520a" containerName="mariadb-account-create-update" Mar 21 04:05:44 crc kubenswrapper[4685]: I0321 04:05:44.638035 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="c914374a-7747-4f27-aeee-b8a64331159a" containerName="mariadb-database-create" Mar 21 04:05:44 crc kubenswrapper[4685]: I0321 04:05:44.638595 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-jgnjz" Mar 21 04:05:44 crc kubenswrapper[4685]: I0321 04:05:44.643538 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-barbican-dockercfg-n2t7f" Mar 21 04:05:44 crc kubenswrapper[4685]: I0321 04:05:44.644734 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-config-data" Mar 21 04:05:44 crc kubenswrapper[4685]: I0321 04:05:44.654850 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-jgnjz"] Mar 21 04:05:44 crc kubenswrapper[4685]: I0321 04:05:44.680273 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e-db-sync-config-data\") pod \"barbican-db-sync-jgnjz\" (UID: \"ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e\") " pod="barbican-kuttl-tests/barbican-db-sync-jgnjz" Mar 21 04:05:44 crc kubenswrapper[4685]: I0321 04:05:44.680407 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2f6l\" (UniqueName: \"kubernetes.io/projected/ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e-kube-api-access-l2f6l\") pod \"barbican-db-sync-jgnjz\" (UID: \"ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e\") " pod="barbican-kuttl-tests/barbican-db-sync-jgnjz" Mar 21 04:05:44 crc kubenswrapper[4685]: I0321 04:05:44.780956 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e-db-sync-config-data\") pod \"barbican-db-sync-jgnjz\" (UID: \"ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e\") " pod="barbican-kuttl-tests/barbican-db-sync-jgnjz" Mar 21 04:05:44 crc kubenswrapper[4685]: I0321 04:05:44.781462 4685 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2f6l\" (UniqueName: \"kubernetes.io/projected/ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e-kube-api-access-l2f6l\") pod \"barbican-db-sync-jgnjz\" (UID: \"ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e\") " pod="barbican-kuttl-tests/barbican-db-sync-jgnjz" Mar 21 04:05:44 crc kubenswrapper[4685]: I0321 04:05:44.787669 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e-db-sync-config-data\") pod \"barbican-db-sync-jgnjz\" (UID: \"ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e\") " pod="barbican-kuttl-tests/barbican-db-sync-jgnjz" Mar 21 04:05:44 crc kubenswrapper[4685]: I0321 04:05:44.801877 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2f6l\" (UniqueName: \"kubernetes.io/projected/ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e-kube-api-access-l2f6l\") pod \"barbican-db-sync-jgnjz\" (UID: \"ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e\") " pod="barbican-kuttl-tests/barbican-db-sync-jgnjz" Mar 21 04:05:44 crc kubenswrapper[4685]: I0321 04:05:44.991210 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-jgnjz" Mar 21 04:05:45 crc kubenswrapper[4685]: I0321 04:05:45.391419 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-jgnjz"] Mar 21 04:05:45 crc kubenswrapper[4685]: I0321 04:05:45.894157 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-jgnjz" event={"ID":"ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e","Type":"ContainerStarted","Data":"08cab9bebf6725001848795dbe34a2712d7630cb38f69bf9840867b8bf47d49d"} Mar 21 04:05:49 crc kubenswrapper[4685]: I0321 04:05:49.921485 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-jgnjz" event={"ID":"ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e","Type":"ContainerStarted","Data":"5a5861306173fdcbba6d612a677946fced4cd4857b9630c2deab3b98569c2b37"} Mar 21 04:05:54 crc kubenswrapper[4685]: I0321 04:05:54.959515 4685 generic.go:334] "Generic (PLEG): container finished" podID="ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e" containerID="5a5861306173fdcbba6d612a677946fced4cd4857b9630c2deab3b98569c2b37" exitCode=0 Mar 21 04:05:54 crc kubenswrapper[4685]: I0321 04:05:54.959640 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-jgnjz" event={"ID":"ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e","Type":"ContainerDied","Data":"5a5861306173fdcbba6d612a677946fced4cd4857b9630c2deab3b98569c2b37"} Mar 21 04:05:56 crc kubenswrapper[4685]: I0321 04:05:56.255050 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-jgnjz" Mar 21 04:05:56 crc kubenswrapper[4685]: I0321 04:05:56.447157 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e-db-sync-config-data\") pod \"ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e\" (UID: \"ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e\") " Mar 21 04:05:56 crc kubenswrapper[4685]: I0321 04:05:56.447300 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2f6l\" (UniqueName: \"kubernetes.io/projected/ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e-kube-api-access-l2f6l\") pod \"ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e\" (UID: \"ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e\") " Mar 21 04:05:56 crc kubenswrapper[4685]: I0321 04:05:56.453484 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e" (UID: "ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:05:56 crc kubenswrapper[4685]: I0321 04:05:56.454161 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e-kube-api-access-l2f6l" (OuterVolumeSpecName: "kube-api-access-l2f6l") pod "ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e" (UID: "ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e"). InnerVolumeSpecName "kube-api-access-l2f6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:05:56 crc kubenswrapper[4685]: I0321 04:05:56.549353 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2f6l\" (UniqueName: \"kubernetes.io/projected/ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e-kube-api-access-l2f6l\") on node \"crc\" DevicePath \"\"" Mar 21 04:05:56 crc kubenswrapper[4685]: I0321 04:05:56.549604 4685 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:05:56 crc kubenswrapper[4685]: I0321 04:05:56.979464 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-jgnjz" event={"ID":"ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e","Type":"ContainerDied","Data":"08cab9bebf6725001848795dbe34a2712d7630cb38f69bf9840867b8bf47d49d"} Mar 21 04:05:56 crc kubenswrapper[4685]: I0321 04:05:56.979509 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-jgnjz" Mar 21 04:05:56 crc kubenswrapper[4685]: I0321 04:05:56.979523 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08cab9bebf6725001848795dbe34a2712d7630cb38f69bf9840867b8bf47d49d" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.115354 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-worker-5d6fc64879-tmvxj"] Mar 21 04:05:57 crc kubenswrapper[4685]: E0321 04:05:57.115613 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e" containerName="barbican-db-sync" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.115628 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e" containerName="barbican-db-sync" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.115745 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e" containerName="barbican-db-sync" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.116489 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-tmvxj" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.119892 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-worker-config-data" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.124147 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-barbican-dockercfg-n2t7f" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.124470 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-config-data" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.141385 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-worker-5d6fc64879-tmvxj"] Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.183477 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-97wnd"] Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.184756 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-97wnd" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.187582 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-keystone-listener-config-data" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.195622 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-97wnd"] Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.260639 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx8q8\" (UniqueName: \"kubernetes.io/projected/6b2406cc-b010-443b-89e6-2dc27034a38e-kube-api-access-hx8q8\") pod \"barbican-worker-5d6fc64879-tmvxj\" (UID: \"6b2406cc-b010-443b-89e6-2dc27034a38e\") " pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-tmvxj" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.260727 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b2406cc-b010-443b-89e6-2dc27034a38e-config-data-custom\") pod \"barbican-worker-5d6fc64879-tmvxj\" (UID: \"6b2406cc-b010-443b-89e6-2dc27034a38e\") " pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-tmvxj" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.260764 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b2406cc-b010-443b-89e6-2dc27034a38e-config-data\") pod \"barbican-worker-5d6fc64879-tmvxj\" (UID: \"6b2406cc-b010-443b-89e6-2dc27034a38e\") " pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-tmvxj" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.260796 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b2406cc-b010-443b-89e6-2dc27034a38e-logs\") pod \"barbican-worker-5d6fc64879-tmvxj\" (UID: \"6b2406cc-b010-443b-89e6-2dc27034a38e\") " pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-tmvxj" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.306235 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-api-86df886d56-rnpb9"] Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.307175 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-api-86df886d56-rnpb9" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.310346 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-api-config-data" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.320084 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-api-86df886d56-rnpb9"] Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.362254 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx8q8\" (UniqueName: \"kubernetes.io/projected/6b2406cc-b010-443b-89e6-2dc27034a38e-kube-api-access-hx8q8\") pod \"barbican-worker-5d6fc64879-tmvxj\" (UID: \"6b2406cc-b010-443b-89e6-2dc27034a38e\") " pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-tmvxj" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.362324 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62f3656b-1837-4b86-a106-923137d04fcf-config-data\") pod \"barbican-keystone-listener-566574dc7b-97wnd\" (UID: \"62f3656b-1837-4b86-a106-923137d04fcf\") " pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-97wnd" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.362363 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b2406cc-b010-443b-89e6-2dc27034a38e-config-data-custom\") pod \"barbican-worker-5d6fc64879-tmvxj\" (UID: \"6b2406cc-b010-443b-89e6-2dc27034a38e\") " pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-tmvxj" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.362390 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62f3656b-1837-4b86-a106-923137d04fcf-logs\") pod \"barbican-keystone-listener-566574dc7b-97wnd\" (UID: \"62f3656b-1837-4b86-a106-923137d04fcf\") " pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-97wnd" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.362776 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b2406cc-b010-443b-89e6-2dc27034a38e-config-data\") pod \"barbican-worker-5d6fc64879-tmvxj\" (UID: \"6b2406cc-b010-443b-89e6-2dc27034a38e\") " pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-tmvxj" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.362805 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b2406cc-b010-443b-89e6-2dc27034a38e-logs\") pod \"barbican-worker-5d6fc64879-tmvxj\" (UID: \"6b2406cc-b010-443b-89e6-2dc27034a38e\") " pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-tmvxj" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.362870 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62f3656b-1837-4b86-a106-923137d04fcf-config-data-custom\") pod \"barbican-keystone-listener-566574dc7b-97wnd\" (UID: \"62f3656b-1837-4b86-a106-923137d04fcf\") " pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-97wnd" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.362962 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7f6w9\" (UniqueName: \"kubernetes.io/projected/62f3656b-1837-4b86-a106-923137d04fcf-kube-api-access-7f6w9\") pod \"barbican-keystone-listener-566574dc7b-97wnd\" (UID: \"62f3656b-1837-4b86-a106-923137d04fcf\") " pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-97wnd" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.363284 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b2406cc-b010-443b-89e6-2dc27034a38e-logs\") pod \"barbican-worker-5d6fc64879-tmvxj\" (UID: \"6b2406cc-b010-443b-89e6-2dc27034a38e\") " pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-tmvxj" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.374862 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b2406cc-b010-443b-89e6-2dc27034a38e-config-data-custom\") pod \"barbican-worker-5d6fc64879-tmvxj\" (UID: \"6b2406cc-b010-443b-89e6-2dc27034a38e\") " pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-tmvxj" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.380067 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b2406cc-b010-443b-89e6-2dc27034a38e-config-data\") pod \"barbican-worker-5d6fc64879-tmvxj\" (UID: \"6b2406cc-b010-443b-89e6-2dc27034a38e\") " pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-tmvxj" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.405468 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx8q8\" (UniqueName: \"kubernetes.io/projected/6b2406cc-b010-443b-89e6-2dc27034a38e-kube-api-access-hx8q8\") pod \"barbican-worker-5d6fc64879-tmvxj\" (UID: \"6b2406cc-b010-443b-89e6-2dc27034a38e\") " pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-tmvxj" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.439517 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-tmvxj" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.464861 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62f3656b-1837-4b86-a106-923137d04fcf-config-data\") pod \"barbican-keystone-listener-566574dc7b-97wnd\" (UID: \"62f3656b-1837-4b86-a106-923137d04fcf\") " pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-97wnd" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.464934 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53cb4c49-8904-4b10-a275-2cbe3ca4257a-config-data\") pod \"barbican-api-86df886d56-rnpb9\" (UID: \"53cb4c49-8904-4b10-a275-2cbe3ca4257a\") " pod="barbican-kuttl-tests/barbican-api-86df886d56-rnpb9" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.464959 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62f3656b-1837-4b86-a106-923137d04fcf-logs\") pod \"barbican-keystone-listener-566574dc7b-97wnd\" (UID: \"62f3656b-1837-4b86-a106-923137d04fcf\") " pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-97wnd" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.465001 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn97n\" (UniqueName: \"kubernetes.io/projected/53cb4c49-8904-4b10-a275-2cbe3ca4257a-kube-api-access-jn97n\") pod \"barbican-api-86df886d56-rnpb9\" (UID: \"53cb4c49-8904-4b10-a275-2cbe3ca4257a\") " pod="barbican-kuttl-tests/barbican-api-86df886d56-rnpb9" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.465033 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62f3656b-1837-4b86-a106-923137d04fcf-config-data-custom\") pod \"barbican-keystone-listener-566574dc7b-97wnd\" (UID: \"62f3656b-1837-4b86-a106-923137d04fcf\") " pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-97wnd" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.465050 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f6w9\" (UniqueName: \"kubernetes.io/projected/62f3656b-1837-4b86-a106-923137d04fcf-kube-api-access-7f6w9\") pod \"barbican-keystone-listener-566574dc7b-97wnd\" (UID: \"62f3656b-1837-4b86-a106-923137d04fcf\") " pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-97wnd" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.465081 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53cb4c49-8904-4b10-a275-2cbe3ca4257a-logs\") pod \"barbican-api-86df886d56-rnpb9\" (UID: \"53cb4c49-8904-4b10-a275-2cbe3ca4257a\") " pod="barbican-kuttl-tests/barbican-api-86df886d56-rnpb9" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.465129 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/53cb4c49-8904-4b10-a275-2cbe3ca4257a-config-data-custom\") pod \"barbican-api-86df886d56-rnpb9\" (UID: \"53cb4c49-8904-4b10-a275-2cbe3ca4257a\") " pod="barbican-kuttl-tests/barbican-api-86df886d56-rnpb9" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.467404 4685 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62f3656b-1837-4b86-a106-923137d04fcf-logs\") pod \"barbican-keystone-listener-566574dc7b-97wnd\" (UID: \"62f3656b-1837-4b86-a106-923137d04fcf\") " pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-97wnd" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.476014 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62f3656b-1837-4b86-a106-923137d04fcf-config-data\") pod \"barbican-keystone-listener-566574dc7b-97wnd\" (UID: \"62f3656b-1837-4b86-a106-923137d04fcf\") " pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-97wnd" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.479048 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62f3656b-1837-4b86-a106-923137d04fcf-config-data-custom\") pod \"barbican-keystone-listener-566574dc7b-97wnd\" (UID: \"62f3656b-1837-4b86-a106-923137d04fcf\") " pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-97wnd" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.487041 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f6w9\" (UniqueName: \"kubernetes.io/projected/62f3656b-1837-4b86-a106-923137d04fcf-kube-api-access-7f6w9\") pod \"barbican-keystone-listener-566574dc7b-97wnd\" (UID: \"62f3656b-1837-4b86-a106-923137d04fcf\") " pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-97wnd" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.497528 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-97wnd" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.566496 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/53cb4c49-8904-4b10-a275-2cbe3ca4257a-config-data-custom\") pod \"barbican-api-86df886d56-rnpb9\" (UID: \"53cb4c49-8904-4b10-a275-2cbe3ca4257a\") " pod="barbican-kuttl-tests/barbican-api-86df886d56-rnpb9" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.566560 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53cb4c49-8904-4b10-a275-2cbe3ca4257a-config-data\") pod \"barbican-api-86df886d56-rnpb9\" (UID: \"53cb4c49-8904-4b10-a275-2cbe3ca4257a\") " pod="barbican-kuttl-tests/barbican-api-86df886d56-rnpb9" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.566600 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn97n\" (UniqueName: \"kubernetes.io/projected/53cb4c49-8904-4b10-a275-2cbe3ca4257a-kube-api-access-jn97n\") pod \"barbican-api-86df886d56-rnpb9\" (UID: \"53cb4c49-8904-4b10-a275-2cbe3ca4257a\") " pod="barbican-kuttl-tests/barbican-api-86df886d56-rnpb9" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.566629 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53cb4c49-8904-4b10-a275-2cbe3ca4257a-logs\") pod \"barbican-api-86df886d56-rnpb9\" (UID: \"53cb4c49-8904-4b10-a275-2cbe3ca4257a\") " pod="barbican-kuttl-tests/barbican-api-86df886d56-rnpb9" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.567289 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/53cb4c49-8904-4b10-a275-2cbe3ca4257a-logs\") pod \"barbican-api-86df886d56-rnpb9\" (UID: \"53cb4c49-8904-4b10-a275-2cbe3ca4257a\") " pod="barbican-kuttl-tests/barbican-api-86df886d56-rnpb9" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.573621 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/53cb4c49-8904-4b10-a275-2cbe3ca4257a-config-data-custom\") pod \"barbican-api-86df886d56-rnpb9\" (UID: \"53cb4c49-8904-4b10-a275-2cbe3ca4257a\") " pod="barbican-kuttl-tests/barbican-api-86df886d56-rnpb9" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.576050 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53cb4c49-8904-4b10-a275-2cbe3ca4257a-config-data\") pod \"barbican-api-86df886d56-rnpb9\" (UID: \"53cb4c49-8904-4b10-a275-2cbe3ca4257a\") " pod="barbican-kuttl-tests/barbican-api-86df886d56-rnpb9" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.591509 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-api-86df886d56-cjzpf"] Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.592778 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-86df886d56-cjzpf" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.607039 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn97n\" (UniqueName: \"kubernetes.io/projected/53cb4c49-8904-4b10-a275-2cbe3ca4257a-kube-api-access-jn97n\") pod \"barbican-api-86df886d56-rnpb9\" (UID: \"53cb4c49-8904-4b10-a275-2cbe3ca4257a\") " pod="barbican-kuttl-tests/barbican-api-86df886d56-rnpb9" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.614752 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-api-86df886d56-cjzpf"] Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.626561 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-86df886d56-rnpb9" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.745543 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-8ppvx"] Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.746591 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-8ppvx" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.755462 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-8ppvx"] Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.769800 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd-config-data-custom\") pod \"barbican-api-86df886d56-cjzpf\" (UID: \"4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd\") " pod="barbican-kuttl-tests/barbican-api-86df886d56-cjzpf" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.769909 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd-config-data\") pod \"barbican-api-86df886d56-cjzpf\" (UID: \"4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd\") " pod="barbican-kuttl-tests/barbican-api-86df886d56-cjzpf" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.769933 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlh7z\" (UniqueName: \"kubernetes.io/projected/4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd-kube-api-access-jlh7z\") pod \"barbican-api-86df886d56-cjzpf\" (UID: \"4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd\") " pod="barbican-kuttl-tests/barbican-api-86df886d56-cjzpf" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.769960 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd-logs\") pod \"barbican-api-86df886d56-cjzpf\" (UID: \"4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd\") " pod="barbican-kuttl-tests/barbican-api-86df886d56-cjzpf" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.820112 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-97wnd"] Mar 21 04:05:57 crc kubenswrapper[4685]: W0321 04:05:57.824507 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62f3656b_1837_4b86_a106_923137d04fcf.slice/crio-99000af8a82f257571786a79f79231ac50b81f8ce1f0323aac62ccf858ff1a6d WatchSource:0}: Error finding container 99000af8a82f257571786a79f79231ac50b81f8ce1f0323aac62ccf858ff1a6d: Status 404 returned error can't find the container with id 99000af8a82f257571786a79f79231ac50b81f8ce1f0323aac62ccf858ff1a6d Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.871703 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf854e07-f0f9-4160-9dbc-ccda00f50a21-logs\") pod \"barbican-keystone-listener-566574dc7b-8ppvx\" (UID: \"cf854e07-f0f9-4160-9dbc-ccda00f50a21\") " pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-8ppvx" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.871781 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd-config-data\") pod \"barbican-api-86df886d56-cjzpf\" (UID: \"4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd\") " pod="barbican-kuttl-tests/barbican-api-86df886d56-cjzpf" Mar 21 04:05:57 crc kubenswrapper[4685]: 
I0321 04:05:57.871810 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlh7z\" (UniqueName: \"kubernetes.io/projected/4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd-kube-api-access-jlh7z\") pod \"barbican-api-86df886d56-cjzpf\" (UID: \"4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd\") " pod="barbican-kuttl-tests/barbican-api-86df886d56-cjzpf" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.871829 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf854e07-f0f9-4160-9dbc-ccda00f50a21-config-data\") pod \"barbican-keystone-listener-566574dc7b-8ppvx\" (UID: \"cf854e07-f0f9-4160-9dbc-ccda00f50a21\") " pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-8ppvx" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.871878 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd-logs\") pod \"barbican-api-86df886d56-cjzpf\" (UID: \"4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd\") " pod="barbican-kuttl-tests/barbican-api-86df886d56-cjzpf" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.871911 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcpbg\" (UniqueName: \"kubernetes.io/projected/cf854e07-f0f9-4160-9dbc-ccda00f50a21-kube-api-access-jcpbg\") pod \"barbican-keystone-listener-566574dc7b-8ppvx\" (UID: \"cf854e07-f0f9-4160-9dbc-ccda00f50a21\") " pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-8ppvx" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.872342 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd-logs\") pod \"barbican-api-86df886d56-cjzpf\" (UID: \"4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd\") " pod="barbican-kuttl-tests/barbican-api-86df886d56-cjzpf" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.872433 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd-config-data-custom\") pod \"barbican-api-86df886d56-cjzpf\" (UID: \"4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd\") " pod="barbican-kuttl-tests/barbican-api-86df886d56-cjzpf" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.872460 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf854e07-f0f9-4160-9dbc-ccda00f50a21-config-data-custom\") pod \"barbican-keystone-listener-566574dc7b-8ppvx\" (UID: \"cf854e07-f0f9-4160-9dbc-ccda00f50a21\") " pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-8ppvx" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.875272 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd-config-data\") pod \"barbican-api-86df886d56-cjzpf\" (UID: \"4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd\") " pod="barbican-kuttl-tests/barbican-api-86df886d56-cjzpf" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.875617 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd-config-data-custom\") pod 
\"barbican-api-86df886d56-cjzpf\" (UID: \"4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd\") " pod="barbican-kuttl-tests/barbican-api-86df886d56-cjzpf" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.889685 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlh7z\" (UniqueName: \"kubernetes.io/projected/4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd-kube-api-access-jlh7z\") pod \"barbican-api-86df886d56-cjzpf\" (UID: \"4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd\") " pod="barbican-kuttl-tests/barbican-api-86df886d56-cjzpf" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.926156 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-86df886d56-cjzpf" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.928879 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-worker-5d6fc64879-r4p2z"] Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.935564 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-r4p2z" Mar 21 04:05:57 crc kubenswrapper[4685]: W0321 04:05:57.944923 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b2406cc_b010_443b_89e6_2dc27034a38e.slice/crio-5b1b60452566f218c1a4bd28c85f108d5a2671d504f81a63bcba22d4bd723095 WatchSource:0}: Error finding container 5b1b60452566f218c1a4bd28c85f108d5a2671d504f81a63bcba22d4bd723095: Status 404 returned error can't find the container with id 5b1b60452566f218c1a4bd28c85f108d5a2671d504f81a63bcba22d4bd723095 Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.955216 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-worker-5d6fc64879-tmvxj"] Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.973500 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcpbg\" (UniqueName: \"kubernetes.io/projected/cf854e07-f0f9-4160-9dbc-ccda00f50a21-kube-api-access-jcpbg\") pod \"barbican-keystone-listener-566574dc7b-8ppvx\" (UID: \"cf854e07-f0f9-4160-9dbc-ccda00f50a21\") " pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-8ppvx" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.973568 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf854e07-f0f9-4160-9dbc-ccda00f50a21-config-data-custom\") pod \"barbican-keystone-listener-566574dc7b-8ppvx\" (UID: \"cf854e07-f0f9-4160-9dbc-ccda00f50a21\") " pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-8ppvx" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.973600 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf854e07-f0f9-4160-9dbc-ccda00f50a21-logs\") pod \"barbican-keystone-listener-566574dc7b-8ppvx\" (UID: \"cf854e07-f0f9-4160-9dbc-ccda00f50a21\") " pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-8ppvx" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.973722 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf854e07-f0f9-4160-9dbc-ccda00f50a21-config-data\") pod \"barbican-keystone-listener-566574dc7b-8ppvx\" (UID: \"cf854e07-f0f9-4160-9dbc-ccda00f50a21\") " pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-8ppvx" Mar 
21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.974749 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf854e07-f0f9-4160-9dbc-ccda00f50a21-logs\") pod \"barbican-keystone-listener-566574dc7b-8ppvx\" (UID: \"cf854e07-f0f9-4160-9dbc-ccda00f50a21\") " pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-8ppvx" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.977328 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf854e07-f0f9-4160-9dbc-ccda00f50a21-config-data-custom\") pod \"barbican-keystone-listener-566574dc7b-8ppvx\" (UID: \"cf854e07-f0f9-4160-9dbc-ccda00f50a21\") " pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-8ppvx" Mar 21 04:05:57 crc kubenswrapper[4685]: W0321 04:05:57.977406 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53cb4c49_8904_4b10_a275_2cbe3ca4257a.slice/crio-05316b1a67a8c6432577a0b04c2ff6317d0048d6367d5a463e8116c887d7002d WatchSource:0}: Error finding container 05316b1a67a8c6432577a0b04c2ff6317d0048d6367d5a463e8116c887d7002d: Status 404 returned error can't find the container with id 05316b1a67a8c6432577a0b04c2ff6317d0048d6367d5a463e8116c887d7002d Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.979251 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-worker-5d6fc64879-r4p2z"] Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.984592 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf854e07-f0f9-4160-9dbc-ccda00f50a21-config-data\") pod \"barbican-keystone-listener-566574dc7b-8ppvx\" (UID: \"cf854e07-f0f9-4160-9dbc-ccda00f50a21\") " pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-8ppvx" Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.986278 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-api-86df886d56-rnpb9"] Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.986390 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-tmvxj" event={"ID":"6b2406cc-b010-443b-89e6-2dc27034a38e","Type":"ContainerStarted","Data":"5b1b60452566f218c1a4bd28c85f108d5a2671d504f81a63bcba22d4bd723095"} Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.987568 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-86df886d56-rnpb9" event={"ID":"53cb4c49-8904-4b10-a275-2cbe3ca4257a","Type":"ContainerStarted","Data":"05316b1a67a8c6432577a0b04c2ff6317d0048d6367d5a463e8116c887d7002d"} Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.988909 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-97wnd" event={"ID":"62f3656b-1837-4b86-a106-923137d04fcf","Type":"ContainerStarted","Data":"99000af8a82f257571786a79f79231ac50b81f8ce1f0323aac62ccf858ff1a6d"} Mar 21 04:05:57 crc kubenswrapper[4685]: I0321 04:05:57.990389 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcpbg\" (UniqueName: \"kubernetes.io/projected/cf854e07-f0f9-4160-9dbc-ccda00f50a21-kube-api-access-jcpbg\") pod \"barbican-keystone-listener-566574dc7b-8ppvx\" (UID: \"cf854e07-f0f9-4160-9dbc-ccda00f50a21\") " 
pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-8ppvx" Mar 21 04:05:58 crc kubenswrapper[4685]: I0321 04:05:58.066720 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-8ppvx" Mar 21 04:05:58 crc kubenswrapper[4685]: I0321 04:05:58.074798 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c05bf573-3e4a-4bee-8638-61b2c36dce22-config-data\") pod \"barbican-worker-5d6fc64879-r4p2z\" (UID: \"c05bf573-3e4a-4bee-8638-61b2c36dce22\") " pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-r4p2z" Mar 21 04:05:58 crc kubenswrapper[4685]: I0321 04:05:58.074908 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c05bf573-3e4a-4bee-8638-61b2c36dce22-logs\") pod \"barbican-worker-5d6fc64879-r4p2z\" (UID: \"c05bf573-3e4a-4bee-8638-61b2c36dce22\") " pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-r4p2z" Mar 21 04:05:58 crc kubenswrapper[4685]: I0321 04:05:58.074943 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c05bf573-3e4a-4bee-8638-61b2c36dce22-config-data-custom\") pod \"barbican-worker-5d6fc64879-r4p2z\" (UID: \"c05bf573-3e4a-4bee-8638-61b2c36dce22\") " pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-r4p2z" Mar 21 04:05:58 crc kubenswrapper[4685]: I0321 04:05:58.074969 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k465r\" (UniqueName: \"kubernetes.io/projected/c05bf573-3e4a-4bee-8638-61b2c36dce22-kube-api-access-k465r\") pod \"barbican-worker-5d6fc64879-r4p2z\" (UID: \"c05bf573-3e4a-4bee-8638-61b2c36dce22\") " pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-r4p2z" Mar 21 04:05:58 crc kubenswrapper[4685]: I0321 04:05:58.153623 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-api-86df886d56-cjzpf"] Mar 21 04:05:58 crc kubenswrapper[4685]: W0321 04:05:58.170949 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cee7f36_cab3_42c7_a2d2_f3ffc81d85cd.slice/crio-f9be55e5f7ace865932b5e0ee184451b2dc3af24db16ddab9c603364a319eb4c WatchSource:0}: Error finding container f9be55e5f7ace865932b5e0ee184451b2dc3af24db16ddab9c603364a319eb4c: Status 404 returned error can't find the container with id f9be55e5f7ace865932b5e0ee184451b2dc3af24db16ddab9c603364a319eb4c Mar 21 04:05:58 crc kubenswrapper[4685]: I0321 04:05:58.177710 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c05bf573-3e4a-4bee-8638-61b2c36dce22-logs\") pod \"barbican-worker-5d6fc64879-r4p2z\" (UID: \"c05bf573-3e4a-4bee-8638-61b2c36dce22\") " pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-r4p2z" Mar 21 04:05:58 crc kubenswrapper[4685]: I0321 04:05:58.177760 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c05bf573-3e4a-4bee-8638-61b2c36dce22-config-data-custom\") pod \"barbican-worker-5d6fc64879-r4p2z\" (UID: \"c05bf573-3e4a-4bee-8638-61b2c36dce22\") " pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-r4p2z" Mar 21 04:05:58 crc kubenswrapper[4685]: I0321 
04:05:58.177788 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k465r\" (UniqueName: \"kubernetes.io/projected/c05bf573-3e4a-4bee-8638-61b2c36dce22-kube-api-access-k465r\") pod \"barbican-worker-5d6fc64879-r4p2z\" (UID: \"c05bf573-3e4a-4bee-8638-61b2c36dce22\") " pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-r4p2z" Mar 21 04:05:58 crc kubenswrapper[4685]: I0321 04:05:58.177880 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c05bf573-3e4a-4bee-8638-61b2c36dce22-config-data\") pod \"barbican-worker-5d6fc64879-r4p2z\" (UID: \"c05bf573-3e4a-4bee-8638-61b2c36dce22\") " pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-r4p2z" Mar 21 04:05:58 crc kubenswrapper[4685]: I0321 04:05:58.178492 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c05bf573-3e4a-4bee-8638-61b2c36dce22-logs\") pod \"barbican-worker-5d6fc64879-r4p2z\" (UID: \"c05bf573-3e4a-4bee-8638-61b2c36dce22\") " pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-r4p2z" Mar 21 04:05:58 crc kubenswrapper[4685]: I0321 04:05:58.186511 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c05bf573-3e4a-4bee-8638-61b2c36dce22-config-data\") pod \"barbican-worker-5d6fc64879-r4p2z\" (UID: \"c05bf573-3e4a-4bee-8638-61b2c36dce22\") " pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-r4p2z" Mar 21 04:05:58 crc kubenswrapper[4685]: I0321 04:05:58.193508 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c05bf573-3e4a-4bee-8638-61b2c36dce22-config-data-custom\") pod \"barbican-worker-5d6fc64879-r4p2z\" (UID: \"c05bf573-3e4a-4bee-8638-61b2c36dce22\") " pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-r4p2z" Mar 21 04:05:58 crc kubenswrapper[4685]: I0321 04:05:58.201735 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k465r\" (UniqueName: \"kubernetes.io/projected/c05bf573-3e4a-4bee-8638-61b2c36dce22-kube-api-access-k465r\") pod \"barbican-worker-5d6fc64879-r4p2z\" (UID: \"c05bf573-3e4a-4bee-8638-61b2c36dce22\") " pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-r4p2z" Mar 21 04:05:58 crc kubenswrapper[4685]: I0321 04:05:58.273774 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-r4p2z" Mar 21 04:05:58 crc kubenswrapper[4685]: I0321 04:05:58.523624 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-8ppvx"] Mar 21 04:05:58 crc kubenswrapper[4685]: I0321 04:05:58.800805 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-worker-5d6fc64879-r4p2z"] Mar 21 04:05:58 crc kubenswrapper[4685]: W0321 04:05:58.843634 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc05bf573_3e4a_4bee_8638_61b2c36dce22.slice/crio-16957e5ebcc5faef4c58614487c49c85ac87689f56b8e12f7fe7fd76bc4933ad WatchSource:0}: Error finding container 16957e5ebcc5faef4c58614487c49c85ac87689f56b8e12f7fe7fd76bc4933ad: Status 404 returned error can't find the container with id 16957e5ebcc5faef4c58614487c49c85ac87689f56b8e12f7fe7fd76bc4933ad Mar 21 04:05:58 crc kubenswrapper[4685]: I0321 04:05:58.996816 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-8ppvx" event={"ID":"cf854e07-f0f9-4160-9dbc-ccda00f50a21","Type":"ContainerStarted","Data":"7c98da6f2f706689474ba49a9508852a237f7aacfb5691d14df4db3b1746384a"} Mar 21 04:05:58 crc kubenswrapper[4685]: I0321 04:05:58.998180 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-r4p2z" event={"ID":"c05bf573-3e4a-4bee-8638-61b2c36dce22","Type":"ContainerStarted","Data":"16957e5ebcc5faef4c58614487c49c85ac87689f56b8e12f7fe7fd76bc4933ad"} Mar 21 04:05:59 crc kubenswrapper[4685]: I0321 04:05:59.000630 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-86df886d56-cjzpf" event={"ID":"4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd","Type":"ContainerStarted","Data":"41cfd3811fe1482b9e8bab18a7d2f690647e35903683e88a424d17c6ffc124a6"} Mar 21 04:05:59 crc kubenswrapper[4685]: I0321 04:05:59.000672 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-86df886d56-cjzpf" event={"ID":"4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd","Type":"ContainerStarted","Data":"511b55c99ad676623d4a4999ffd85f8233405261ac93dfd36569eca36a87ce1b"} Mar 21 04:05:59 crc kubenswrapper[4685]: I0321 04:05:59.000682 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-86df886d56-cjzpf" event={"ID":"4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd","Type":"ContainerStarted","Data":"f9be55e5f7ace865932b5e0ee184451b2dc3af24db16ddab9c603364a319eb4c"} Mar 21 04:05:59 crc kubenswrapper[4685]: I0321 04:05:59.001109 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/barbican-api-86df886d56-cjzpf" Mar 21 04:05:59 crc kubenswrapper[4685]: I0321 04:05:59.001140 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/barbican-api-86df886d56-cjzpf" Mar 21 04:05:59 crc kubenswrapper[4685]: I0321 04:05:59.003188 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-86df886d56-rnpb9" event={"ID":"53cb4c49-8904-4b10-a275-2cbe3ca4257a","Type":"ContainerStarted","Data":"a3f02f09bde54ff7a78bc8cbe1b31f76a02c72b6f30df8ce56db02499946cfe4"} Mar 21 04:05:59 crc kubenswrapper[4685]: I0321 04:05:59.003233 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-86df886d56-rnpb9" 
event={"ID":"53cb4c49-8904-4b10-a275-2cbe3ca4257a","Type":"ContainerStarted","Data":"0a0e12b3ad93c3db08827cb7032497442e7154b41ec6a1849798c184aa3185de"} Mar 21 04:05:59 crc kubenswrapper[4685]: I0321 04:05:59.003382 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/barbican-api-86df886d56-rnpb9" Mar 21 04:05:59 crc kubenswrapper[4685]: I0321 04:05:59.003408 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/barbican-api-86df886d56-rnpb9" Mar 21 04:05:59 crc kubenswrapper[4685]: I0321 04:05:59.015902 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-api-86df886d56-cjzpf" podStartSLOduration=2.015883462 podStartE2EDuration="2.015883462s" podCreationTimestamp="2026-03-21 04:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:05:59.015355018 +0000 UTC m=+1191.492423810" watchObservedRunningTime="2026-03-21 04:05:59.015883462 +0000 UTC m=+1191.492952254" Mar 21 04:05:59 crc kubenswrapper[4685]: I0321 04:05:59.032593 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-api-86df886d56-rnpb9" podStartSLOduration=2.03257373 podStartE2EDuration="2.03257373s" podCreationTimestamp="2026-03-21 04:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:05:59.0297256 +0000 UTC m=+1191.506794402" watchObservedRunningTime="2026-03-21 04:05:59.03257373 +0000 UTC m=+1191.509642522" Mar 21 04:05:59 crc kubenswrapper[4685]: I0321 04:05:59.102715 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-api-86df886d56-cjzpf"] Mar 21 04:05:59 crc kubenswrapper[4685]: I0321 04:05:59.305614 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-8ppvx"] Mar 21 04:05:59 crc kubenswrapper[4685]: I0321 04:05:59.491659 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-worker-5d6fc64879-r4p2z"] Mar 21 04:06:00 crc kubenswrapper[4685]: I0321 04:06:00.136959 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567766-lgcgr"] Mar 21 04:06:00 crc kubenswrapper[4685]: I0321 04:06:00.138832 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567766-lgcgr" Mar 21 04:06:00 crc kubenswrapper[4685]: I0321 04:06:00.140881 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k75cc" Mar 21 04:06:00 crc kubenswrapper[4685]: I0321 04:06:00.142242 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 04:06:00 crc kubenswrapper[4685]: I0321 04:06:00.142416 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 04:06:00 crc kubenswrapper[4685]: I0321 04:06:00.154784 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567766-lgcgr"] Mar 21 04:06:00 crc kubenswrapper[4685]: I0321 04:06:00.324830 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdbsj\" (UniqueName: \"kubernetes.io/projected/7bbb4a07-790d-41ce-a9d0-40dd729bba6e-kube-api-access-jdbsj\") pod \"auto-csr-approver-29567766-lgcgr\" (UID: \"7bbb4a07-790d-41ce-a9d0-40dd729bba6e\") " pod="openshift-infra/auto-csr-approver-29567766-lgcgr" Mar 21 04:06:00 crc kubenswrapper[4685]: I0321 04:06:00.426362 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdbsj\" (UniqueName: \"kubernetes.io/projected/7bbb4a07-790d-41ce-a9d0-40dd729bba6e-kube-api-access-jdbsj\") pod \"auto-csr-approver-29567766-lgcgr\" (UID: \"7bbb4a07-790d-41ce-a9d0-40dd729bba6e\") " pod="openshift-infra/auto-csr-approver-29567766-lgcgr" Mar 21 04:06:00 crc kubenswrapper[4685]: I0321 04:06:00.448359 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdbsj\" (UniqueName: \"kubernetes.io/projected/7bbb4a07-790d-41ce-a9d0-40dd729bba6e-kube-api-access-jdbsj\") pod \"auto-csr-approver-29567766-lgcgr\" (UID: \"7bbb4a07-790d-41ce-a9d0-40dd729bba6e\") " pod="openshift-infra/auto-csr-approver-29567766-lgcgr" Mar 21 04:06:00 crc kubenswrapper[4685]: I0321 04:06:00.497353 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567766-lgcgr" Mar 21 04:06:00 crc kubenswrapper[4685]: I0321 04:06:00.607464 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-api-86df886d56-rnpb9"] Mar 21 04:06:00 crc kubenswrapper[4685]: I0321 04:06:00.820671 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-97wnd"] Mar 21 04:06:00 crc kubenswrapper[4685]: I0321 04:06:00.989256 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567766-lgcgr"] Mar 21 04:06:01 crc kubenswrapper[4685]: I0321 04:06:01.024746 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567766-lgcgr" event={"ID":"7bbb4a07-790d-41ce-a9d0-40dd729bba6e","Type":"ContainerStarted","Data":"aba37e6f136d88c61cb83b698a378c784e456c5b4a0aa040399e365460f4a028"} Mar 21 04:06:01 crc kubenswrapper[4685]: I0321 04:06:01.026051 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-worker-5d6fc64879-tmvxj"] Mar 21 04:06:01 crc kubenswrapper[4685]: I0321 04:06:01.026953 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-tmvxj" event={"ID":"6b2406cc-b010-443b-89e6-2dc27034a38e","Type":"ContainerStarted","Data":"f0704f16170a637c23b471ddea3dc2599fa06ced5e76f4dbff23493f106d22c0"} Mar 21 04:06:01 crc kubenswrapper[4685]: I0321 04:06:01.027006 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-tmvxj" event={"ID":"6b2406cc-b010-443b-89e6-2dc27034a38e","Type":"ContainerStarted","Data":"eb7c2fd6ed1b34e95369b0cbf54f469189e85cc26fdf1d690d3b5b1575914134"} Mar 21 04:06:01 crc kubenswrapper[4685]: I0321 04:06:01.030567 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-8ppvx" event={"ID":"cf854e07-f0f9-4160-9dbc-ccda00f50a21","Type":"ContainerStarted","Data":"bd97aea7a640f3a4b7ab19871ef72a957097598c0d7085456eef047123523924"} Mar 21 04:06:01 crc kubenswrapper[4685]: I0321 04:06:01.030621 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-8ppvx" event={"ID":"cf854e07-f0f9-4160-9dbc-ccda00f50a21","Type":"ContainerStarted","Data":"3635458b4890450e0c9674ed9954b74779c9f2f2247462d1d69aa7931c450202"} Mar 21 04:06:01 crc kubenswrapper[4685]: I0321 04:06:01.030628 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-8ppvx" podUID="cf854e07-f0f9-4160-9dbc-ccda00f50a21" containerName="barbican-keystone-listener-log" containerID="cri-o://3635458b4890450e0c9674ed9954b74779c9f2f2247462d1d69aa7931c450202" gracePeriod=30 Mar 21 04:06:01 crc kubenswrapper[4685]: I0321 04:06:01.030674 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-8ppvx" podUID="cf854e07-f0f9-4160-9dbc-ccda00f50a21" containerName="barbican-keystone-listener" containerID="cri-o://bd97aea7a640f3a4b7ab19871ef72a957097598c0d7085456eef047123523924" gracePeriod=30 Mar 21 04:06:01 crc kubenswrapper[4685]: I0321 04:06:01.043428 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-97wnd" 
event={"ID":"62f3656b-1837-4b86-a106-923137d04fcf","Type":"ContainerStarted","Data":"81a3c54be7bf2f00c15c57402ff5a31d220690fe6a9989ad300ffd170fb8d001"} Mar 21 04:06:01 crc kubenswrapper[4685]: I0321 04:06:01.043507 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-97wnd" event={"ID":"62f3656b-1837-4b86-a106-923137d04fcf","Type":"ContainerStarted","Data":"229530ae4cfdea08483f46a407d1f32e4c9cc897d4da146322b58ee6cc87b2c4"} Mar 21 04:06:01 crc kubenswrapper[4685]: I0321 04:06:01.045373 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-r4p2z" event={"ID":"c05bf573-3e4a-4bee-8638-61b2c36dce22","Type":"ContainerStarted","Data":"13ec1b041ce06de426f7f70f380db4dd4f8178435b8c10695fc72aeff679a45f"} Mar 21 04:06:01 crc kubenswrapper[4685]: I0321 04:06:01.045407 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-r4p2z" event={"ID":"c05bf573-3e4a-4bee-8638-61b2c36dce22","Type":"ContainerStarted","Data":"4b2374ad56331236fea61b91e79b5f59d5b0f3e71d4d59513faa4d5d0dcd14a5"} Mar 21 04:06:01 crc kubenswrapper[4685]: I0321 04:06:01.045494 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-api-86df886d56-rnpb9" podUID="53cb4c49-8904-4b10-a275-2cbe3ca4257a" containerName="barbican-api-log" containerID="cri-o://0a0e12b3ad93c3db08827cb7032497442e7154b41ec6a1849798c184aa3185de" gracePeriod=30 Mar 21 04:06:01 crc kubenswrapper[4685]: I0321 04:06:01.045544 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-api-86df886d56-rnpb9" podUID="53cb4c49-8904-4b10-a275-2cbe3ca4257a" containerName="barbican-api" containerID="cri-o://a3f02f09bde54ff7a78bc8cbe1b31f76a02c72b6f30df8ce56db02499946cfe4" gracePeriod=30 Mar 21 04:06:01 crc kubenswrapper[4685]: I0321 04:06:01.045686 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-api-86df886d56-cjzpf" podUID="4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd" containerName="barbican-api" containerID="cri-o://41cfd3811fe1482b9e8bab18a7d2f690647e35903683e88a424d17c6ffc124a6" gracePeriod=30 Mar 21 04:06:01 crc kubenswrapper[4685]: I0321 04:06:01.045682 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-api-86df886d56-cjzpf" podUID="4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd" containerName="barbican-api-log" containerID="cri-o://511b55c99ad676623d4a4999ffd85f8233405261ac93dfd36569eca36a87ce1b" gracePeriod=30 Mar 21 04:06:01 crc kubenswrapper[4685]: I0321 04:06:01.045743 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-r4p2z" podUID="c05bf573-3e4a-4bee-8638-61b2c36dce22" containerName="barbican-worker" containerID="cri-o://13ec1b041ce06de426f7f70f380db4dd4f8178435b8c10695fc72aeff679a45f" gracePeriod=30 Mar 21 04:06:01 crc kubenswrapper[4685]: I0321 04:06:01.045711 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-r4p2z" podUID="c05bf573-3e4a-4bee-8638-61b2c36dce22" containerName="barbican-worker-log" containerID="cri-o://4b2374ad56331236fea61b91e79b5f59d5b0f3e71d4d59513faa4d5d0dcd14a5" gracePeriod=30 Mar 21 04:06:01 crc kubenswrapper[4685]: I0321 04:06:01.064977 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-tmvxj" podStartSLOduration=2.254151325 podStartE2EDuration="4.064959023s" podCreationTimestamp="2026-03-21 04:05:57 +0000 UTC" firstStartedPulling="2026-03-21 04:05:57.957435813 +0000 UTC m=+1190.434504605" lastFinishedPulling="2026-03-21 04:05:59.768243521 +0000 UTC m=+1192.245312303" observedRunningTime="2026-03-21 04:06:01.059761028 +0000 UTC m=+1193.536829830" watchObservedRunningTime="2026-03-21 04:06:01.064959023 +0000 UTC m=+1193.542027815" Mar 21 04:06:01 crc kubenswrapper[4685]: I0321 04:06:01.089708 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-97wnd" podStartSLOduration=2.161545483 podStartE2EDuration="4.089689416s" podCreationTimestamp="2026-03-21 04:05:57 +0000 UTC" firstStartedPulling="2026-03-21 04:05:57.827930627 +0000 UTC m=+1190.304999419" lastFinishedPulling="2026-03-21 04:05:59.75607456 +0000 UTC m=+1192.233143352" observedRunningTime="2026-03-21 04:06:01.088088671 +0000 UTC m=+1193.565157463" watchObservedRunningTime="2026-03-21 04:06:01.089689416 +0000 UTC m=+1193.566758208" Mar 21 04:06:01 crc kubenswrapper[4685]: I0321 04:06:01.108436 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-8ppvx" podStartSLOduration=2.884061426 podStartE2EDuration="4.108420661s" podCreationTimestamp="2026-03-21 04:05:57 +0000 UTC" firstStartedPulling="2026-03-21 04:05:58.543820714 +0000 UTC m=+1191.020889516" lastFinishedPulling="2026-03-21 04:05:59.768179939 +0000 UTC m=+1192.245248751" observedRunningTime="2026-03-21 04:06:01.1037741 +0000 UTC m=+1193.580842902" watchObservedRunningTime="2026-03-21 04:06:01.108420661 +0000 UTC m=+1193.585489453" Mar 21 04:06:01 crc kubenswrapper[4685]: I0321 04:06:01.118717 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-r4p2z" podStartSLOduration=3.194397825 podStartE2EDuration="4.118695498s" podCreationTimestamp="2026-03-21 04:05:57 +0000 UTC" firstStartedPulling="2026-03-21 04:05:58.853120455 +0000 UTC m=+1191.330189247" lastFinishedPulling="2026-03-21 04:05:59.777418118 +0000 UTC m=+1192.254486920" observedRunningTime="2026-03-21 04:06:01.117241558 +0000 UTC m=+1193.594310350" watchObservedRunningTime="2026-03-21 04:06:01.118695498 +0000 UTC m=+1193.595764290" Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.055958 4685 generic.go:334] "Generic (PLEG): container finished" podID="c05bf573-3e4a-4bee-8638-61b2c36dce22" containerID="4b2374ad56331236fea61b91e79b5f59d5b0f3e71d4d59513faa4d5d0dcd14a5" exitCode=143 Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.056319 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-r4p2z" event={"ID":"c05bf573-3e4a-4bee-8638-61b2c36dce22","Type":"ContainerDied","Data":"4b2374ad56331236fea61b91e79b5f59d5b0f3e71d4d59513faa4d5d0dcd14a5"} Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.058750 4685 generic.go:334] "Generic (PLEG): container finished" podID="4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd" containerID="41cfd3811fe1482b9e8bab18a7d2f690647e35903683e88a424d17c6ffc124a6" exitCode=0 Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.058773 4685 generic.go:334] "Generic (PLEG): container finished" podID="4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd" containerID="511b55c99ad676623d4a4999ffd85f8233405261ac93dfd36569eca36a87ce1b" exitCode=143 Mar 21 
04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.058805 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-86df886d56-cjzpf" event={"ID":"4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd","Type":"ContainerDied","Data":"41cfd3811fe1482b9e8bab18a7d2f690647e35903683e88a424d17c6ffc124a6"} Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.058825 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-86df886d56-cjzpf" event={"ID":"4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd","Type":"ContainerDied","Data":"511b55c99ad676623d4a4999ffd85f8233405261ac93dfd36569eca36a87ce1b"} Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.066411 4685 generic.go:334] "Generic (PLEG): container finished" podID="53cb4c49-8904-4b10-a275-2cbe3ca4257a" containerID="a3f02f09bde54ff7a78bc8cbe1b31f76a02c72b6f30df8ce56db02499946cfe4" exitCode=0 Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.066466 4685 generic.go:334] "Generic (PLEG): container finished" podID="53cb4c49-8904-4b10-a275-2cbe3ca4257a" containerID="0a0e12b3ad93c3db08827cb7032497442e7154b41ec6a1849798c184aa3185de" exitCode=143 Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.066569 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-86df886d56-rnpb9" event={"ID":"53cb4c49-8904-4b10-a275-2cbe3ca4257a","Type":"ContainerDied","Data":"a3f02f09bde54ff7a78bc8cbe1b31f76a02c72b6f30df8ce56db02499946cfe4"} Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.066604 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-86df886d56-rnpb9" event={"ID":"53cb4c49-8904-4b10-a275-2cbe3ca4257a","Type":"ContainerDied","Data":"0a0e12b3ad93c3db08827cb7032497442e7154b41ec6a1849798c184aa3185de"} Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.078000 4685 generic.go:334] "Generic (PLEG): container finished" podID="cf854e07-f0f9-4160-9dbc-ccda00f50a21" containerID="3635458b4890450e0c9674ed9954b74779c9f2f2247462d1d69aa7931c450202" exitCode=143 Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.078188 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-tmvxj" podUID="6b2406cc-b010-443b-89e6-2dc27034a38e" containerName="barbican-worker-log" containerID="cri-o://eb7c2fd6ed1b34e95369b0cbf54f469189e85cc26fdf1d690d3b5b1575914134" gracePeriod=30 Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.078692 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-8ppvx" event={"ID":"cf854e07-f0f9-4160-9dbc-ccda00f50a21","Type":"ContainerDied","Data":"3635458b4890450e0c9674ed9954b74779c9f2f2247462d1d69aa7931c450202"} Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.079026 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-97wnd" podUID="62f3656b-1837-4b86-a106-923137d04fcf" containerName="barbican-keystone-listener-log" containerID="cri-o://229530ae4cfdea08483f46a407d1f32e4c9cc897d4da146322b58ee6cc87b2c4" gracePeriod=30 Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.079383 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-tmvxj" podUID="6b2406cc-b010-443b-89e6-2dc27034a38e" containerName="barbican-worker" containerID="cri-o://f0704f16170a637c23b471ddea3dc2599fa06ced5e76f4dbff23493f106d22c0" 
gracePeriod=30 Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.079458 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-97wnd" podUID="62f3656b-1837-4b86-a106-923137d04fcf" containerName="barbican-keystone-listener" containerID="cri-o://81a3c54be7bf2f00c15c57402ff5a31d220690fe6a9989ad300ffd170fb8d001" gracePeriod=30 Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.205011 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-jgnjz"] Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.219968 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-jgnjz"] Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.222139 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-86df886d56-cjzpf" Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.228369 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-86df886d56-rnpb9" Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.252621 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican1eeb-account-delete-lckg8"] Mar 21 04:06:02 crc kubenswrapper[4685]: E0321 04:06:02.258319 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53cb4c49-8904-4b10-a275-2cbe3ca4257a" containerName="barbican-api" Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.258351 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="53cb4c49-8904-4b10-a275-2cbe3ca4257a" containerName="barbican-api" Mar 21 04:06:02 crc kubenswrapper[4685]: E0321 04:06:02.258375 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd" containerName="barbican-api-log" Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.258382 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd" containerName="barbican-api-log" Mar 21 04:06:02 crc kubenswrapper[4685]: E0321 04:06:02.258400 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd" containerName="barbican-api" Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.258406 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd" containerName="barbican-api" Mar 21 04:06:02 crc kubenswrapper[4685]: E0321 04:06:02.258416 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53cb4c49-8904-4b10-a275-2cbe3ca4257a" containerName="barbican-api-log" Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.258421 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="53cb4c49-8904-4b10-a275-2cbe3ca4257a" containerName="barbican-api-log" Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.258588 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd" containerName="barbican-api-log" Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.258599 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="53cb4c49-8904-4b10-a275-2cbe3ca4257a" containerName="barbican-api" Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.258612 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd" containerName="barbican-api" Mar 21 04:06:02 crc 
kubenswrapper[4685]: I0321 04:06:02.258625 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="53cb4c49-8904-4b10-a275-2cbe3ca4257a" containerName="barbican-api-log" Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.259053 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican1eeb-account-delete-lckg8" Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.273549 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican1eeb-account-delete-lckg8"] Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.332174 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e" path="/var/lib/kubelet/pods/ef3ba9cc-12c1-4e77-9d4e-ef59eff1ad6e/volumes" Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.370660 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd-logs\") pod \"4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd\" (UID: \"4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd\") " Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.370911 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53cb4c49-8904-4b10-a275-2cbe3ca4257a-logs\") pod \"53cb4c49-8904-4b10-a275-2cbe3ca4257a\" (UID: \"53cb4c49-8904-4b10-a275-2cbe3ca4257a\") " Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.370996 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd-config-data\") pod \"4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd\" (UID: \"4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd\") " Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.371120 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jn97n\" (UniqueName: \"kubernetes.io/projected/53cb4c49-8904-4b10-a275-2cbe3ca4257a-kube-api-access-jn97n\") pod \"53cb4c49-8904-4b10-a275-2cbe3ca4257a\" (UID: \"53cb4c49-8904-4b10-a275-2cbe3ca4257a\") " Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.371244 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/53cb4c49-8904-4b10-a275-2cbe3ca4257a-config-data-custom\") pod \"53cb4c49-8904-4b10-a275-2cbe3ca4257a\" (UID: \"53cb4c49-8904-4b10-a275-2cbe3ca4257a\") " Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.371316 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53cb4c49-8904-4b10-a275-2cbe3ca4257a-config-data\") pod \"53cb4c49-8904-4b10-a275-2cbe3ca4257a\" (UID: \"53cb4c49-8904-4b10-a275-2cbe3ca4257a\") " Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.371404 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd-config-data-custom\") pod \"4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd\" (UID: \"4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd\") " Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.371457 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd-logs" (OuterVolumeSpecName: "logs") pod "4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd" (UID: 
"4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.371571 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlh7z\" (UniqueName: \"kubernetes.io/projected/4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd-kube-api-access-jlh7z\") pod \"4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd\" (UID: \"4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd\") " Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.371868 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n82d\" (UniqueName: \"kubernetes.io/projected/1a9bdf2c-4aab-4f9a-9c39-30c961e3cd29-kube-api-access-4n82d\") pod \"barbican1eeb-account-delete-lckg8\" (UID: \"1a9bdf2c-4aab-4f9a-9c39-30c961e3cd29\") " pod="barbican-kuttl-tests/barbican1eeb-account-delete-lckg8" Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.371988 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a9bdf2c-4aab-4f9a-9c39-30c961e3cd29-operator-scripts\") pod \"barbican1eeb-account-delete-lckg8\" (UID: \"1a9bdf2c-4aab-4f9a-9c39-30c961e3cd29\") " pod="barbican-kuttl-tests/barbican1eeb-account-delete-lckg8" Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.372114 4685 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.372495 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53cb4c49-8904-4b10-a275-2cbe3ca4257a-logs" (OuterVolumeSpecName: "logs") pod "53cb4c49-8904-4b10-a275-2cbe3ca4257a" (UID: "53cb4c49-8904-4b10-a275-2cbe3ca4257a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.381282 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53cb4c49-8904-4b10-a275-2cbe3ca4257a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "53cb4c49-8904-4b10-a275-2cbe3ca4257a" (UID: "53cb4c49-8904-4b10-a275-2cbe3ca4257a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.388535 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd" (UID: "4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.388568 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd-kube-api-access-jlh7z" (OuterVolumeSpecName: "kube-api-access-jlh7z") pod "4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd" (UID: "4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd"). InnerVolumeSpecName "kube-api-access-jlh7z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.390609 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53cb4c49-8904-4b10-a275-2cbe3ca4257a-kube-api-access-jn97n" (OuterVolumeSpecName: "kube-api-access-jn97n") pod "53cb4c49-8904-4b10-a275-2cbe3ca4257a" (UID: "53cb4c49-8904-4b10-a275-2cbe3ca4257a"). InnerVolumeSpecName "kube-api-access-jn97n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.414002 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd-config-data" (OuterVolumeSpecName: "config-data") pod "4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd" (UID: "4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.426963 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53cb4c49-8904-4b10-a275-2cbe3ca4257a-config-data" (OuterVolumeSpecName: "config-data") pod "53cb4c49-8904-4b10-a275-2cbe3ca4257a" (UID: "53cb4c49-8904-4b10-a275-2cbe3ca4257a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.473339 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a9bdf2c-4aab-4f9a-9c39-30c961e3cd29-operator-scripts\") pod \"barbican1eeb-account-delete-lckg8\" (UID: \"1a9bdf2c-4aab-4f9a-9c39-30c961e3cd29\") " pod="barbican-kuttl-tests/barbican1eeb-account-delete-lckg8" Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.473513 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n82d\" (UniqueName: \"kubernetes.io/projected/1a9bdf2c-4aab-4f9a-9c39-30c961e3cd29-kube-api-access-4n82d\") pod \"barbican1eeb-account-delete-lckg8\" (UID: \"1a9bdf2c-4aab-4f9a-9c39-30c961e3cd29\") " pod="barbican-kuttl-tests/barbican1eeb-account-delete-lckg8" Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.473588 4685 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53cb4c49-8904-4b10-a275-2cbe3ca4257a-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.473620 4685 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.473634 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jn97n\" (UniqueName: \"kubernetes.io/projected/53cb4c49-8904-4b10-a275-2cbe3ca4257a-kube-api-access-jn97n\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.473646 4685 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/53cb4c49-8904-4b10-a275-2cbe3ca4257a-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.473658 4685 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53cb4c49-8904-4b10-a275-2cbe3ca4257a-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:02 crc 
kubenswrapper[4685]: I0321 04:06:02.473669 4685 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.473682 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlh7z\" (UniqueName: \"kubernetes.io/projected/4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd-kube-api-access-jlh7z\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.474192 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a9bdf2c-4aab-4f9a-9c39-30c961e3cd29-operator-scripts\") pod \"barbican1eeb-account-delete-lckg8\" (UID: \"1a9bdf2c-4aab-4f9a-9c39-30c961e3cd29\") " pod="barbican-kuttl-tests/barbican1eeb-account-delete-lckg8" Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.491622 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n82d\" (UniqueName: \"kubernetes.io/projected/1a9bdf2c-4aab-4f9a-9c39-30c961e3cd29-kube-api-access-4n82d\") pod \"barbican1eeb-account-delete-lckg8\" (UID: \"1a9bdf2c-4aab-4f9a-9c39-30c961e3cd29\") " pod="barbican-kuttl-tests/barbican1eeb-account-delete-lckg8" Mar 21 04:06:02 crc kubenswrapper[4685]: I0321 04:06:02.574283 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican1eeb-account-delete-lckg8" Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.052483 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-tmvxj" Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.061682 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican1eeb-account-delete-lckg8"] Mar 21 04:06:03 crc kubenswrapper[4685]: W0321 04:06:03.065312 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a9bdf2c_4aab_4f9a_9c39_30c961e3cd29.slice/crio-aed1e2736fc137709f6d9870dc7c90c077e736b571c337b96bf084e730ef4ea2 WatchSource:0}: Error finding container aed1e2736fc137709f6d9870dc7c90c077e736b571c337b96bf084e730ef4ea2: Status 404 returned error can't find the container with id aed1e2736fc137709f6d9870dc7c90c077e736b571c337b96bf084e730ef4ea2 Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.087299 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican1eeb-account-delete-lckg8" event={"ID":"1a9bdf2c-4aab-4f9a-9c39-30c961e3cd29","Type":"ContainerStarted","Data":"aed1e2736fc137709f6d9870dc7c90c077e736b571c337b96bf084e730ef4ea2"} Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.089041 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-api-86df886d56-cjzpf" Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.093371 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-86df886d56-cjzpf" event={"ID":"4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd","Type":"ContainerDied","Data":"f9be55e5f7ace865932b5e0ee184451b2dc3af24db16ddab9c603364a319eb4c"} Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.093436 4685 scope.go:117] "RemoveContainer" containerID="41cfd3811fe1482b9e8bab18a7d2f690647e35903683e88a424d17c6ffc124a6" Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.107273 4685 generic.go:334] "Generic (PLEG): container finished" podID="6b2406cc-b010-443b-89e6-2dc27034a38e" containerID="f0704f16170a637c23b471ddea3dc2599fa06ced5e76f4dbff23493f106d22c0" exitCode=0 Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.107336 4685 generic.go:334] "Generic (PLEG): container finished" podID="6b2406cc-b010-443b-89e6-2dc27034a38e" containerID="eb7c2fd6ed1b34e95369b0cbf54f469189e85cc26fdf1d690d3b5b1575914134" exitCode=143 Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.107409 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-tmvxj" Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.107496 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-tmvxj" event={"ID":"6b2406cc-b010-443b-89e6-2dc27034a38e","Type":"ContainerDied","Data":"f0704f16170a637c23b471ddea3dc2599fa06ced5e76f4dbff23493f106d22c0"} Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.107560 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-tmvxj" event={"ID":"6b2406cc-b010-443b-89e6-2dc27034a38e","Type":"ContainerDied","Data":"eb7c2fd6ed1b34e95369b0cbf54f469189e85cc26fdf1d690d3b5b1575914134"} Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.107583 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-tmvxj" event={"ID":"6b2406cc-b010-443b-89e6-2dc27034a38e","Type":"ContainerDied","Data":"5b1b60452566f218c1a4bd28c85f108d5a2671d504f81a63bcba22d4bd723095"} Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.111707 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-86df886d56-rnpb9" event={"ID":"53cb4c49-8904-4b10-a275-2cbe3ca4257a","Type":"ContainerDied","Data":"05316b1a67a8c6432577a0b04c2ff6317d0048d6367d5a463e8116c887d7002d"} Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.111739 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-api-86df886d56-rnpb9" Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.113634 4685 generic.go:334] "Generic (PLEG): container finished" podID="62f3656b-1837-4b86-a106-923137d04fcf" containerID="81a3c54be7bf2f00c15c57402ff5a31d220690fe6a9989ad300ffd170fb8d001" exitCode=0 Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.113663 4685 generic.go:334] "Generic (PLEG): container finished" podID="62f3656b-1837-4b86-a106-923137d04fcf" containerID="229530ae4cfdea08483f46a407d1f32e4c9cc897d4da146322b58ee6cc87b2c4" exitCode=143 Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.113720 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-97wnd" event={"ID":"62f3656b-1837-4b86-a106-923137d04fcf","Type":"ContainerDied","Data":"81a3c54be7bf2f00c15c57402ff5a31d220690fe6a9989ad300ffd170fb8d001"} Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.113745 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-97wnd" event={"ID":"62f3656b-1837-4b86-a106-923137d04fcf","Type":"ContainerDied","Data":"229530ae4cfdea08483f46a407d1f32e4c9cc897d4da146322b58ee6cc87b2c4"} Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.131577 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-api-86df886d56-cjzpf"] Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.140551 4685 generic.go:334] "Generic (PLEG): container finished" podID="7bbb4a07-790d-41ce-a9d0-40dd729bba6e" containerID="5e9acff5a2e10e21a9d5026b7bb3068bf20c1f1220a53a874327c1e90be60013" exitCode=0 Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.140615 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567766-lgcgr" event={"ID":"7bbb4a07-790d-41ce-a9d0-40dd729bba6e","Type":"ContainerDied","Data":"5e9acff5a2e10e21a9d5026b7bb3068bf20c1f1220a53a874327c1e90be60013"} Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.142820 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-api-86df886d56-cjzpf"] Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.151214 4685 scope.go:117] "RemoveContainer" containerID="511b55c99ad676623d4a4999ffd85f8233405261ac93dfd36569eca36a87ce1b" Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.186933 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hx8q8\" (UniqueName: \"kubernetes.io/projected/6b2406cc-b010-443b-89e6-2dc27034a38e-kube-api-access-hx8q8\") pod \"6b2406cc-b010-443b-89e6-2dc27034a38e\" (UID: \"6b2406cc-b010-443b-89e6-2dc27034a38e\") " Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.186967 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b2406cc-b010-443b-89e6-2dc27034a38e-config-data-custom\") pod \"6b2406cc-b010-443b-89e6-2dc27034a38e\" (UID: \"6b2406cc-b010-443b-89e6-2dc27034a38e\") " Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.187012 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b2406cc-b010-443b-89e6-2dc27034a38e-config-data\") pod \"6b2406cc-b010-443b-89e6-2dc27034a38e\" (UID: \"6b2406cc-b010-443b-89e6-2dc27034a38e\") " Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.187090 4685 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b2406cc-b010-443b-89e6-2dc27034a38e-logs\") pod \"6b2406cc-b010-443b-89e6-2dc27034a38e\" (UID: \"6b2406cc-b010-443b-89e6-2dc27034a38e\") " Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.187915 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b2406cc-b010-443b-89e6-2dc27034a38e-logs" (OuterVolumeSpecName: "logs") pod "6b2406cc-b010-443b-89e6-2dc27034a38e" (UID: "6b2406cc-b010-443b-89e6-2dc27034a38e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.197256 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b2406cc-b010-443b-89e6-2dc27034a38e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6b2406cc-b010-443b-89e6-2dc27034a38e" (UID: "6b2406cc-b010-443b-89e6-2dc27034a38e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.201023 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b2406cc-b010-443b-89e6-2dc27034a38e-kube-api-access-hx8q8" (OuterVolumeSpecName: "kube-api-access-hx8q8") pod "6b2406cc-b010-443b-89e6-2dc27034a38e" (UID: "6b2406cc-b010-443b-89e6-2dc27034a38e"). InnerVolumeSpecName "kube-api-access-hx8q8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.210549 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-api-86df886d56-rnpb9"] Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.210832 4685 scope.go:117] "RemoveContainer" containerID="f0704f16170a637c23b471ddea3dc2599fa06ced5e76f4dbff23493f106d22c0" Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.216697 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-api-86df886d56-rnpb9"] Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.235171 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-97wnd" Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.242255 4685 scope.go:117] "RemoveContainer" containerID="eb7c2fd6ed1b34e95369b0cbf54f469189e85cc26fdf1d690d3b5b1575914134" Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.252941 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b2406cc-b010-443b-89e6-2dc27034a38e-config-data" (OuterVolumeSpecName: "config-data") pod "6b2406cc-b010-443b-89e6-2dc27034a38e" (UID: "6b2406cc-b010-443b-89e6-2dc27034a38e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.272820 4685 scope.go:117] "RemoveContainer" containerID="f0704f16170a637c23b471ddea3dc2599fa06ced5e76f4dbff23493f106d22c0" Mar 21 04:06:03 crc kubenswrapper[4685]: E0321 04:06:03.274062 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0704f16170a637c23b471ddea3dc2599fa06ced5e76f4dbff23493f106d22c0\": container with ID starting with f0704f16170a637c23b471ddea3dc2599fa06ced5e76f4dbff23493f106d22c0 not found: ID does not exist" containerID="f0704f16170a637c23b471ddea3dc2599fa06ced5e76f4dbff23493f106d22c0" Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.274104 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0704f16170a637c23b471ddea3dc2599fa06ced5e76f4dbff23493f106d22c0"} err="failed to get container status \"f0704f16170a637c23b471ddea3dc2599fa06ced5e76f4dbff23493f106d22c0\": rpc error: code = NotFound desc = could not find container \"f0704f16170a637c23b471ddea3dc2599fa06ced5e76f4dbff23493f106d22c0\": container with ID starting with f0704f16170a637c23b471ddea3dc2599fa06ced5e76f4dbff23493f106d22c0 not found: ID does not exist" Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.274129 4685 scope.go:117] "RemoveContainer" containerID="eb7c2fd6ed1b34e95369b0cbf54f469189e85cc26fdf1d690d3b5b1575914134" Mar 21 04:06:03 crc kubenswrapper[4685]: E0321 04:06:03.274796 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb7c2fd6ed1b34e95369b0cbf54f469189e85cc26fdf1d690d3b5b1575914134\": container with ID starting with eb7c2fd6ed1b34e95369b0cbf54f469189e85cc26fdf1d690d3b5b1575914134 not found: ID does not exist" containerID="eb7c2fd6ed1b34e95369b0cbf54f469189e85cc26fdf1d690d3b5b1575914134" Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.274857 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb7c2fd6ed1b34e95369b0cbf54f469189e85cc26fdf1d690d3b5b1575914134"} err="failed to get container status \"eb7c2fd6ed1b34e95369b0cbf54f469189e85cc26fdf1d690d3b5b1575914134\": rpc error: code = NotFound desc = could not find container \"eb7c2fd6ed1b34e95369b0cbf54f469189e85cc26fdf1d690d3b5b1575914134\": container with ID starting with eb7c2fd6ed1b34e95369b0cbf54f469189e85cc26fdf1d690d3b5b1575914134 not found: ID does not exist" Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.274873 4685 scope.go:117] "RemoveContainer" containerID="f0704f16170a637c23b471ddea3dc2599fa06ced5e76f4dbff23493f106d22c0" Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.275067 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0704f16170a637c23b471ddea3dc2599fa06ced5e76f4dbff23493f106d22c0"} err="failed to get container status \"f0704f16170a637c23b471ddea3dc2599fa06ced5e76f4dbff23493f106d22c0\": rpc error: code = NotFound desc = could not find container \"f0704f16170a637c23b471ddea3dc2599fa06ced5e76f4dbff23493f106d22c0\": container with ID starting with f0704f16170a637c23b471ddea3dc2599fa06ced5e76f4dbff23493f106d22c0 not found: ID does not exist" Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.275082 4685 scope.go:117] "RemoveContainer" containerID="eb7c2fd6ed1b34e95369b0cbf54f469189e85cc26fdf1d690d3b5b1575914134" Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.275312 4685 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb7c2fd6ed1b34e95369b0cbf54f469189e85cc26fdf1d690d3b5b1575914134"} err="failed to get container status \"eb7c2fd6ed1b34e95369b0cbf54f469189e85cc26fdf1d690d3b5b1575914134\": rpc error: code = NotFound desc = could not find container \"eb7c2fd6ed1b34e95369b0cbf54f469189e85cc26fdf1d690d3b5b1575914134\": container with ID starting with eb7c2fd6ed1b34e95369b0cbf54f469189e85cc26fdf1d690d3b5b1575914134 not found: ID does not exist" Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.275330 4685 scope.go:117] "RemoveContainer" containerID="a3f02f09bde54ff7a78bc8cbe1b31f76a02c72b6f30df8ce56db02499946cfe4" Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.289728 4685 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b2406cc-b010-443b-89e6-2dc27034a38e-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.289762 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hx8q8\" (UniqueName: \"kubernetes.io/projected/6b2406cc-b010-443b-89e6-2dc27034a38e-kube-api-access-hx8q8\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.289817 4685 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b2406cc-b010-443b-89e6-2dc27034a38e-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.289890 4685 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b2406cc-b010-443b-89e6-2dc27034a38e-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.302604 4685 scope.go:117] "RemoveContainer" containerID="0a0e12b3ad93c3db08827cb7032497442e7154b41ec6a1849798c184aa3185de" Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.390608 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62f3656b-1837-4b86-a106-923137d04fcf-config-data-custom\") pod \"62f3656b-1837-4b86-a106-923137d04fcf\" (UID: \"62f3656b-1837-4b86-a106-923137d04fcf\") " Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.390716 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62f3656b-1837-4b86-a106-923137d04fcf-config-data\") pod \"62f3656b-1837-4b86-a106-923137d04fcf\" (UID: \"62f3656b-1837-4b86-a106-923137d04fcf\") " Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.390793 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f6w9\" (UniqueName: \"kubernetes.io/projected/62f3656b-1837-4b86-a106-923137d04fcf-kube-api-access-7f6w9\") pod \"62f3656b-1837-4b86-a106-923137d04fcf\" (UID: \"62f3656b-1837-4b86-a106-923137d04fcf\") " Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.390859 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62f3656b-1837-4b86-a106-923137d04fcf-logs\") pod \"62f3656b-1837-4b86-a106-923137d04fcf\" (UID: \"62f3656b-1837-4b86-a106-923137d04fcf\") " Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.391684 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/62f3656b-1837-4b86-a106-923137d04fcf-logs" (OuterVolumeSpecName: "logs") pod "62f3656b-1837-4b86-a106-923137d04fcf" (UID: "62f3656b-1837-4b86-a106-923137d04fcf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.394370 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62f3656b-1837-4b86-a106-923137d04fcf-kube-api-access-7f6w9" (OuterVolumeSpecName: "kube-api-access-7f6w9") pod "62f3656b-1837-4b86-a106-923137d04fcf" (UID: "62f3656b-1837-4b86-a106-923137d04fcf"). InnerVolumeSpecName "kube-api-access-7f6w9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.396507 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62f3656b-1837-4b86-a106-923137d04fcf-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "62f3656b-1837-4b86-a106-923137d04fcf" (UID: "62f3656b-1837-4b86-a106-923137d04fcf"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.426107 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62f3656b-1837-4b86-a106-923137d04fcf-config-data" (OuterVolumeSpecName: "config-data") pod "62f3656b-1837-4b86-a106-923137d04fcf" (UID: "62f3656b-1837-4b86-a106-923137d04fcf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.439472 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-worker-5d6fc64879-tmvxj"] Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.446110 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-worker-5d6fc64879-tmvxj"] Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.492321 4685 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62f3656b-1837-4b86-a106-923137d04fcf-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.492522 4685 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62f3656b-1837-4b86-a106-923137d04fcf-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.492600 4685 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62f3656b-1837-4b86-a106-923137d04fcf-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:03 crc kubenswrapper[4685]: I0321 04:06:03.492702 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7f6w9\" (UniqueName: \"kubernetes.io/projected/62f3656b-1837-4b86-a106-923137d04fcf-kube-api-access-7f6w9\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:04 crc kubenswrapper[4685]: I0321 04:06:04.150183 4685 generic.go:334] "Generic (PLEG): container finished" podID="1a9bdf2c-4aab-4f9a-9c39-30c961e3cd29" containerID="535ba5465e6117f6b479f79db72bec5a9b06c354ca9ed94038324a9bcc9b8a4e" exitCode=0 Mar 21 04:06:04 crc kubenswrapper[4685]: I0321 04:06:04.150274 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican1eeb-account-delete-lckg8" 
event={"ID":"1a9bdf2c-4aab-4f9a-9c39-30c961e3cd29","Type":"ContainerDied","Data":"535ba5465e6117f6b479f79db72bec5a9b06c354ca9ed94038324a9bcc9b8a4e"} Mar 21 04:06:04 crc kubenswrapper[4685]: I0321 04:06:04.154657 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-97wnd" event={"ID":"62f3656b-1837-4b86-a106-923137d04fcf","Type":"ContainerDied","Data":"99000af8a82f257571786a79f79231ac50b81f8ce1f0323aac62ccf858ff1a6d"} Mar 21 04:06:04 crc kubenswrapper[4685]: I0321 04:06:04.154697 4685 scope.go:117] "RemoveContainer" containerID="81a3c54be7bf2f00c15c57402ff5a31d220690fe6a9989ad300ffd170fb8d001" Mar 21 04:06:04 crc kubenswrapper[4685]: I0321 04:06:04.154860 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-97wnd" Mar 21 04:06:04 crc kubenswrapper[4685]: I0321 04:06:04.189906 4685 scope.go:117] "RemoveContainer" containerID="229530ae4cfdea08483f46a407d1f32e4c9cc897d4da146322b58ee6cc87b2c4" Mar 21 04:06:04 crc kubenswrapper[4685]: I0321 04:06:04.190900 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-97wnd"] Mar 21 04:06:04 crc kubenswrapper[4685]: I0321 04:06:04.197730 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-97wnd"] Mar 21 04:06:04 crc kubenswrapper[4685]: I0321 04:06:04.313617 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd" path="/var/lib/kubelet/pods/4cee7f36-cab3-42c7-a2d2-f3ffc81d85cd/volumes" Mar 21 04:06:04 crc kubenswrapper[4685]: I0321 04:06:04.315160 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53cb4c49-8904-4b10-a275-2cbe3ca4257a" path="/var/lib/kubelet/pods/53cb4c49-8904-4b10-a275-2cbe3ca4257a/volumes" Mar 21 04:06:04 crc kubenswrapper[4685]: I0321 04:06:04.316042 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62f3656b-1837-4b86-a106-923137d04fcf" path="/var/lib/kubelet/pods/62f3656b-1837-4b86-a106-923137d04fcf/volumes" Mar 21 04:06:04 crc kubenswrapper[4685]: I0321 04:06:04.317288 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b2406cc-b010-443b-89e6-2dc27034a38e" path="/var/lib/kubelet/pods/6b2406cc-b010-443b-89e6-2dc27034a38e/volumes" Mar 21 04:06:04 crc kubenswrapper[4685]: I0321 04:06:04.424402 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567766-lgcgr" Mar 21 04:06:04 crc kubenswrapper[4685]: I0321 04:06:04.607308 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdbsj\" (UniqueName: \"kubernetes.io/projected/7bbb4a07-790d-41ce-a9d0-40dd729bba6e-kube-api-access-jdbsj\") pod \"7bbb4a07-790d-41ce-a9d0-40dd729bba6e\" (UID: \"7bbb4a07-790d-41ce-a9d0-40dd729bba6e\") " Mar 21 04:06:04 crc kubenswrapper[4685]: I0321 04:06:04.613552 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bbb4a07-790d-41ce-a9d0-40dd729bba6e-kube-api-access-jdbsj" (OuterVolumeSpecName: "kube-api-access-jdbsj") pod "7bbb4a07-790d-41ce-a9d0-40dd729bba6e" (UID: "7bbb4a07-790d-41ce-a9d0-40dd729bba6e"). InnerVolumeSpecName "kube-api-access-jdbsj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:06:04 crc kubenswrapper[4685]: I0321 04:06:04.709429 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdbsj\" (UniqueName: \"kubernetes.io/projected/7bbb4a07-790d-41ce-a9d0-40dd729bba6e-kube-api-access-jdbsj\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:05 crc kubenswrapper[4685]: I0321 04:06:05.170632 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567766-lgcgr" event={"ID":"7bbb4a07-790d-41ce-a9d0-40dd729bba6e","Type":"ContainerDied","Data":"aba37e6f136d88c61cb83b698a378c784e456c5b4a0aa040399e365460f4a028"} Mar 21 04:06:05 crc kubenswrapper[4685]: I0321 04:06:05.171123 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aba37e6f136d88c61cb83b698a378c784e456c5b4a0aa040399e365460f4a028" Mar 21 04:06:05 crc kubenswrapper[4685]: I0321 04:06:05.170692 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567766-lgcgr" Mar 21 04:06:05 crc kubenswrapper[4685]: I0321 04:06:05.446131 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican1eeb-account-delete-lckg8" Mar 21 04:06:05 crc kubenswrapper[4685]: I0321 04:06:05.477360 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567760-xpknn"] Mar 21 04:06:05 crc kubenswrapper[4685]: I0321 04:06:05.485683 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567760-xpknn"] Mar 21 04:06:05 crc kubenswrapper[4685]: I0321 04:06:05.620913 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a9bdf2c-4aab-4f9a-9c39-30c961e3cd29-operator-scripts\") pod \"1a9bdf2c-4aab-4f9a-9c39-30c961e3cd29\" (UID: \"1a9bdf2c-4aab-4f9a-9c39-30c961e3cd29\") " Mar 21 04:06:05 crc kubenswrapper[4685]: I0321 04:06:05.620979 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n82d\" (UniqueName: \"kubernetes.io/projected/1a9bdf2c-4aab-4f9a-9c39-30c961e3cd29-kube-api-access-4n82d\") pod \"1a9bdf2c-4aab-4f9a-9c39-30c961e3cd29\" (UID: \"1a9bdf2c-4aab-4f9a-9c39-30c961e3cd29\") " Mar 21 04:06:05 crc kubenswrapper[4685]: I0321 04:06:05.621860 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a9bdf2c-4aab-4f9a-9c39-30c961e3cd29-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1a9bdf2c-4aab-4f9a-9c39-30c961e3cd29" (UID: "1a9bdf2c-4aab-4f9a-9c39-30c961e3cd29"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:06:05 crc kubenswrapper[4685]: I0321 04:06:05.628996 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a9bdf2c-4aab-4f9a-9c39-30c961e3cd29-kube-api-access-4n82d" (OuterVolumeSpecName: "kube-api-access-4n82d") pod "1a9bdf2c-4aab-4f9a-9c39-30c961e3cd29" (UID: "1a9bdf2c-4aab-4f9a-9c39-30c961e3cd29"). InnerVolumeSpecName "kube-api-access-4n82d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:06:05 crc kubenswrapper[4685]: I0321 04:06:05.722585 4685 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a9bdf2c-4aab-4f9a-9c39-30c961e3cd29-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:05 crc kubenswrapper[4685]: I0321 04:06:05.722622 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4n82d\" (UniqueName: \"kubernetes.io/projected/1a9bdf2c-4aab-4f9a-9c39-30c961e3cd29-kube-api-access-4n82d\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:06 crc kubenswrapper[4685]: I0321 04:06:06.186121 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican1eeb-account-delete-lckg8" event={"ID":"1a9bdf2c-4aab-4f9a-9c39-30c961e3cd29","Type":"ContainerDied","Data":"aed1e2736fc137709f6d9870dc7c90c077e736b571c337b96bf084e730ef4ea2"} Mar 21 04:06:06 crc kubenswrapper[4685]: I0321 04:06:06.186196 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aed1e2736fc137709f6d9870dc7c90c077e736b571c337b96bf084e730ef4ea2" Mar 21 04:06:06 crc kubenswrapper[4685]: I0321 04:06:06.186151 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican1eeb-account-delete-lckg8" Mar 21 04:06:06 crc kubenswrapper[4685]: I0321 04:06:06.321502 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f3ae43e-162a-41fc-85f0-92106386bea7" path="/var/lib/kubelet/pods/5f3ae43e-162a-41fc-85f0-92106386bea7/volumes" Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.272531 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-f7bbt"] Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.281061 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-f7bbt"] Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.287745 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican1eeb-account-delete-lckg8"] Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.294068 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-1eeb-account-create-update-p8r2p"] Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.300178 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican1eeb-account-delete-lckg8"] Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.305143 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-1eeb-account-create-update-p8r2p"] Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.356972 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-db-create-z4q5w"] Mar 21 04:06:07 crc kubenswrapper[4685]: E0321 04:06:07.357727 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62f3656b-1837-4b86-a106-923137d04fcf" containerName="barbican-keystone-listener" Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.357753 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="62f3656b-1837-4b86-a106-923137d04fcf" containerName="barbican-keystone-listener" Mar 21 04:06:07 crc kubenswrapper[4685]: E0321 04:06:07.357772 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b2406cc-b010-443b-89e6-2dc27034a38e" containerName="barbican-worker" Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.357783 4685 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6b2406cc-b010-443b-89e6-2dc27034a38e" containerName="barbican-worker" Mar 21 04:06:07 crc kubenswrapper[4685]: E0321 04:06:07.357807 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a9bdf2c-4aab-4f9a-9c39-30c961e3cd29" containerName="mariadb-account-delete" Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.357820 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a9bdf2c-4aab-4f9a-9c39-30c961e3cd29" containerName="mariadb-account-delete" Mar 21 04:06:07 crc kubenswrapper[4685]: E0321 04:06:07.357879 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bbb4a07-790d-41ce-a9d0-40dd729bba6e" containerName="oc" Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.357892 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bbb4a07-790d-41ce-a9d0-40dd729bba6e" containerName="oc" Mar 21 04:06:07 crc kubenswrapper[4685]: E0321 04:06:07.357927 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62f3656b-1837-4b86-a106-923137d04fcf" containerName="barbican-keystone-listener-log" Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.357945 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="62f3656b-1837-4b86-a106-923137d04fcf" containerName="barbican-keystone-listener-log" Mar 21 04:06:07 crc kubenswrapper[4685]: E0321 04:06:07.357972 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b2406cc-b010-443b-89e6-2dc27034a38e" containerName="barbican-worker-log" Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.357981 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b2406cc-b010-443b-89e6-2dc27034a38e" containerName="barbican-worker-log" Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.358296 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b2406cc-b010-443b-89e6-2dc27034a38e" containerName="barbican-worker-log" Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.358324 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="62f3656b-1837-4b86-a106-923137d04fcf" containerName="barbican-keystone-listener-log" Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.358348 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b2406cc-b010-443b-89e6-2dc27034a38e" containerName="barbican-worker" Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.358365 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="62f3656b-1837-4b86-a106-923137d04fcf" containerName="barbican-keystone-listener" Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.358386 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a9bdf2c-4aab-4f9a-9c39-30c961e3cd29" containerName="mariadb-account-delete" Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.358410 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bbb4a07-790d-41ce-a9d0-40dd729bba6e" containerName="oc" Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.359214 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-z4q5w" Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.383250 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-z4q5w"] Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.467146 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-d8e1-account-create-update-hrpbv"] Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.469242 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-d8e1-account-create-update-hrpbv" Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.474169 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-db-secret" Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.474370 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-d8e1-account-create-update-hrpbv"] Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.481116 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hdg4\" (UniqueName: \"kubernetes.io/projected/4048b1a3-3a1f-40c7-be26-3f917fb8b0bd-kube-api-access-2hdg4\") pod \"barbican-db-create-z4q5w\" (UID: \"4048b1a3-3a1f-40c7-be26-3f917fb8b0bd\") " pod="barbican-kuttl-tests/barbican-db-create-z4q5w" Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.481181 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4048b1a3-3a1f-40c7-be26-3f917fb8b0bd-operator-scripts\") pod \"barbican-db-create-z4q5w\" (UID: \"4048b1a3-3a1f-40c7-be26-3f917fb8b0bd\") " pod="barbican-kuttl-tests/barbican-db-create-z4q5w" Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.582793 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbzr2\" (UniqueName: \"kubernetes.io/projected/68904f9b-5534-4133-92d8-37d36c09d27d-kube-api-access-sbzr2\") pod \"barbican-d8e1-account-create-update-hrpbv\" (UID: \"68904f9b-5534-4133-92d8-37d36c09d27d\") " pod="barbican-kuttl-tests/barbican-d8e1-account-create-update-hrpbv" Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.582859 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hdg4\" (UniqueName: \"kubernetes.io/projected/4048b1a3-3a1f-40c7-be26-3f917fb8b0bd-kube-api-access-2hdg4\") pod \"barbican-db-create-z4q5w\" (UID: \"4048b1a3-3a1f-40c7-be26-3f917fb8b0bd\") " pod="barbican-kuttl-tests/barbican-db-create-z4q5w" Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.582909 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68904f9b-5534-4133-92d8-37d36c09d27d-operator-scripts\") pod \"barbican-d8e1-account-create-update-hrpbv\" (UID: \"68904f9b-5534-4133-92d8-37d36c09d27d\") " pod="barbican-kuttl-tests/barbican-d8e1-account-create-update-hrpbv" Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.582936 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4048b1a3-3a1f-40c7-be26-3f917fb8b0bd-operator-scripts\") pod \"barbican-db-create-z4q5w\" (UID: \"4048b1a3-3a1f-40c7-be26-3f917fb8b0bd\") " pod="barbican-kuttl-tests/barbican-db-create-z4q5w" 
Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.583972 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4048b1a3-3a1f-40c7-be26-3f917fb8b0bd-operator-scripts\") pod \"barbican-db-create-z4q5w\" (UID: \"4048b1a3-3a1f-40c7-be26-3f917fb8b0bd\") " pod="barbican-kuttl-tests/barbican-db-create-z4q5w" Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.604131 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hdg4\" (UniqueName: \"kubernetes.io/projected/4048b1a3-3a1f-40c7-be26-3f917fb8b0bd-kube-api-access-2hdg4\") pod \"barbican-db-create-z4q5w\" (UID: \"4048b1a3-3a1f-40c7-be26-3f917fb8b0bd\") " pod="barbican-kuttl-tests/barbican-db-create-z4q5w" Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.684444 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbzr2\" (UniqueName: \"kubernetes.io/projected/68904f9b-5534-4133-92d8-37d36c09d27d-kube-api-access-sbzr2\") pod \"barbican-d8e1-account-create-update-hrpbv\" (UID: \"68904f9b-5534-4133-92d8-37d36c09d27d\") " pod="barbican-kuttl-tests/barbican-d8e1-account-create-update-hrpbv" Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.684580 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68904f9b-5534-4133-92d8-37d36c09d27d-operator-scripts\") pod \"barbican-d8e1-account-create-update-hrpbv\" (UID: \"68904f9b-5534-4133-92d8-37d36c09d27d\") " pod="barbican-kuttl-tests/barbican-d8e1-account-create-update-hrpbv" Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.685888 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68904f9b-5534-4133-92d8-37d36c09d27d-operator-scripts\") pod \"barbican-d8e1-account-create-update-hrpbv\" (UID: \"68904f9b-5534-4133-92d8-37d36c09d27d\") " pod="barbican-kuttl-tests/barbican-d8e1-account-create-update-hrpbv" Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.687388 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-z4q5w" Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.705183 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbzr2\" (UniqueName: \"kubernetes.io/projected/68904f9b-5534-4133-92d8-37d36c09d27d-kube-api-access-sbzr2\") pod \"barbican-d8e1-account-create-update-hrpbv\" (UID: \"68904f9b-5534-4133-92d8-37d36c09d27d\") " pod="barbican-kuttl-tests/barbican-d8e1-account-create-update-hrpbv" Mar 21 04:06:07 crc kubenswrapper[4685]: I0321 04:06:07.787225 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-d8e1-account-create-update-hrpbv" Mar 21 04:06:08 crc kubenswrapper[4685]: I0321 04:06:08.000244 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-z4q5w"] Mar 21 04:06:08 crc kubenswrapper[4685]: I0321 04:06:08.200403 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-create-z4q5w" event={"ID":"4048b1a3-3a1f-40c7-be26-3f917fb8b0bd","Type":"ContainerStarted","Data":"5f25775fce3062c5bdbcd6163f3c973c31355f39195d711af6478b31a9bd4ba6"} Mar 21 04:06:08 crc kubenswrapper[4685]: I0321 04:06:08.292121 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-d8e1-account-create-update-hrpbv"] Mar 21 04:06:08 crc kubenswrapper[4685]: I0321 04:06:08.314948 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-db-secret" Mar 21 04:06:08 crc kubenswrapper[4685]: I0321 04:06:08.318263 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a9bdf2c-4aab-4f9a-9c39-30c961e3cd29" path="/var/lib/kubelet/pods/1a9bdf2c-4aab-4f9a-9c39-30c961e3cd29/volumes" Mar 21 04:06:08 crc kubenswrapper[4685]: I0321 04:06:08.319275 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70c7121a-c5b2-4cf2-83bf-208ca926520a" path="/var/lib/kubelet/pods/70c7121a-c5b2-4cf2-83bf-208ca926520a/volumes" Mar 21 04:06:08 crc kubenswrapper[4685]: I0321 04:06:08.320480 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c914374a-7747-4f27-aeee-b8a64331159a" path="/var/lib/kubelet/pods/c914374a-7747-4f27-aeee-b8a64331159a/volumes" Mar 21 04:06:09 crc kubenswrapper[4685]: I0321 04:06:09.208735 4685 generic.go:334] "Generic (PLEG): container finished" podID="4048b1a3-3a1f-40c7-be26-3f917fb8b0bd" containerID="44e1b45e9ba3a97b24d7c15aabb381845fd30deefbd2c562aa820ba014672ccb" exitCode=0 Mar 21 04:06:09 crc kubenswrapper[4685]: I0321 04:06:09.208817 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-create-z4q5w" event={"ID":"4048b1a3-3a1f-40c7-be26-3f917fb8b0bd","Type":"ContainerDied","Data":"44e1b45e9ba3a97b24d7c15aabb381845fd30deefbd2c562aa820ba014672ccb"} Mar 21 04:06:09 crc kubenswrapper[4685]: I0321 04:06:09.211515 4685 generic.go:334] "Generic (PLEG): container finished" podID="68904f9b-5534-4133-92d8-37d36c09d27d" containerID="0492d6fb42b0838ad25d70798ee623971214615668da21d10d7464027769c5a0" exitCode=0 Mar 21 04:06:09 crc kubenswrapper[4685]: I0321 04:06:09.211548 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-d8e1-account-create-update-hrpbv" event={"ID":"68904f9b-5534-4133-92d8-37d36c09d27d","Type":"ContainerDied","Data":"0492d6fb42b0838ad25d70798ee623971214615668da21d10d7464027769c5a0"} Mar 21 04:06:09 crc kubenswrapper[4685]: I0321 04:06:09.211566 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-d8e1-account-create-update-hrpbv" event={"ID":"68904f9b-5534-4133-92d8-37d36c09d27d","Type":"ContainerStarted","Data":"d9c2043fb93258c20e2a4a4363a39c9cc3860f68cae57edb0203a7960c762daf"} Mar 21 04:06:09 crc kubenswrapper[4685]: I0321 04:06:09.685568 4685 patch_prober.go:28] interesting pod/machine-config-daemon-7r9cg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Mar 21 04:06:09 crc kubenswrapper[4685]: I0321 04:06:09.685632 4685 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:06:09 crc kubenswrapper[4685]: I0321 04:06:09.685681 4685 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" Mar 21 04:06:09 crc kubenswrapper[4685]: I0321 04:06:09.686477 4685 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fc184a6e763e19dcb85e7464f3adc7dbb9e9291d839d9e13c38f6aed20771d12"} pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 04:06:09 crc kubenswrapper[4685]: I0321 04:06:09.686538 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerName="machine-config-daemon" containerID="cri-o://fc184a6e763e19dcb85e7464f3adc7dbb9e9291d839d9e13c38f6aed20771d12" gracePeriod=600 Mar 21 04:06:10 crc kubenswrapper[4685]: I0321 04:06:10.222200 4685 generic.go:334] "Generic (PLEG): container finished" podID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerID="fc184a6e763e19dcb85e7464f3adc7dbb9e9291d839d9e13c38f6aed20771d12" exitCode=0 Mar 21 04:06:10 crc kubenswrapper[4685]: I0321 04:06:10.222279 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" event={"ID":"cea46fe2-4e41-43ab-a069-cb30fb4e732c","Type":"ContainerDied","Data":"fc184a6e763e19dcb85e7464f3adc7dbb9e9291d839d9e13c38f6aed20771d12"} Mar 21 04:06:10 crc kubenswrapper[4685]: I0321 04:06:10.222955 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" event={"ID":"cea46fe2-4e41-43ab-a069-cb30fb4e732c","Type":"ContainerStarted","Data":"855883c827cd8a38d55f90b9086e8832f325f74b077f5a71a8a2d2ad0a467f7f"} Mar 21 04:06:10 crc kubenswrapper[4685]: I0321 04:06:10.222987 4685 scope.go:117] "RemoveContainer" containerID="ac4ffd676ad57605265aed5caa44cae8130cfde3468685b94b3265e3fc4a39a0" Mar 21 04:06:10 crc kubenswrapper[4685]: I0321 04:06:10.568966 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-z4q5w" Mar 21 04:06:10 crc kubenswrapper[4685]: I0321 04:06:10.577045 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-d8e1-account-create-update-hrpbv" Mar 21 04:06:10 crc kubenswrapper[4685]: I0321 04:06:10.630628 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68904f9b-5534-4133-92d8-37d36c09d27d-operator-scripts\") pod \"68904f9b-5534-4133-92d8-37d36c09d27d\" (UID: \"68904f9b-5534-4133-92d8-37d36c09d27d\") " Mar 21 04:06:10 crc kubenswrapper[4685]: I0321 04:06:10.630735 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hdg4\" (UniqueName: \"kubernetes.io/projected/4048b1a3-3a1f-40c7-be26-3f917fb8b0bd-kube-api-access-2hdg4\") pod \"4048b1a3-3a1f-40c7-be26-3f917fb8b0bd\" (UID: \"4048b1a3-3a1f-40c7-be26-3f917fb8b0bd\") " Mar 21 04:06:10 crc kubenswrapper[4685]: I0321 04:06:10.630775 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4048b1a3-3a1f-40c7-be26-3f917fb8b0bd-operator-scripts\") pod \"4048b1a3-3a1f-40c7-be26-3f917fb8b0bd\" (UID: \"4048b1a3-3a1f-40c7-be26-3f917fb8b0bd\") " Mar 21 04:06:10 crc kubenswrapper[4685]: I0321 04:06:10.630803 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbzr2\" (UniqueName: \"kubernetes.io/projected/68904f9b-5534-4133-92d8-37d36c09d27d-kube-api-access-sbzr2\") pod \"68904f9b-5534-4133-92d8-37d36c09d27d\" (UID: \"68904f9b-5534-4133-92d8-37d36c09d27d\") " Mar 21 04:06:10 crc kubenswrapper[4685]: I0321 04:06:10.631715 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68904f9b-5534-4133-92d8-37d36c09d27d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "68904f9b-5534-4133-92d8-37d36c09d27d" (UID: "68904f9b-5534-4133-92d8-37d36c09d27d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:06:10 crc kubenswrapper[4685]: I0321 04:06:10.631783 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4048b1a3-3a1f-40c7-be26-3f917fb8b0bd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4048b1a3-3a1f-40c7-be26-3f917fb8b0bd" (UID: "4048b1a3-3a1f-40c7-be26-3f917fb8b0bd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:06:10 crc kubenswrapper[4685]: I0321 04:06:10.640058 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68904f9b-5534-4133-92d8-37d36c09d27d-kube-api-access-sbzr2" (OuterVolumeSpecName: "kube-api-access-sbzr2") pod "68904f9b-5534-4133-92d8-37d36c09d27d" (UID: "68904f9b-5534-4133-92d8-37d36c09d27d"). InnerVolumeSpecName "kube-api-access-sbzr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:06:10 crc kubenswrapper[4685]: I0321 04:06:10.640196 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4048b1a3-3a1f-40c7-be26-3f917fb8b0bd-kube-api-access-2hdg4" (OuterVolumeSpecName: "kube-api-access-2hdg4") pod "4048b1a3-3a1f-40c7-be26-3f917fb8b0bd" (UID: "4048b1a3-3a1f-40c7-be26-3f917fb8b0bd"). InnerVolumeSpecName "kube-api-access-2hdg4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:06:10 crc kubenswrapper[4685]: I0321 04:06:10.732781 4685 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4048b1a3-3a1f-40c7-be26-3f917fb8b0bd-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:10 crc kubenswrapper[4685]: I0321 04:06:10.732867 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbzr2\" (UniqueName: \"kubernetes.io/projected/68904f9b-5534-4133-92d8-37d36c09d27d-kube-api-access-sbzr2\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:10 crc kubenswrapper[4685]: I0321 04:06:10.732913 4685 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68904f9b-5534-4133-92d8-37d36c09d27d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:10 crc kubenswrapper[4685]: I0321 04:06:10.733020 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hdg4\" (UniqueName: \"kubernetes.io/projected/4048b1a3-3a1f-40c7-be26-3f917fb8b0bd-kube-api-access-2hdg4\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:11 crc kubenswrapper[4685]: I0321 04:06:11.233810 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-create-z4q5w" event={"ID":"4048b1a3-3a1f-40c7-be26-3f917fb8b0bd","Type":"ContainerDied","Data":"5f25775fce3062c5bdbcd6163f3c973c31355f39195d711af6478b31a9bd4ba6"} Mar 21 04:06:11 crc kubenswrapper[4685]: I0321 04:06:11.233916 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-z4q5w" Mar 21 04:06:11 crc kubenswrapper[4685]: I0321 04:06:11.233952 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f25775fce3062c5bdbcd6163f3c973c31355f39195d711af6478b31a9bd4ba6" Mar 21 04:06:11 crc kubenswrapper[4685]: I0321 04:06:11.238689 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-d8e1-account-create-update-hrpbv" event={"ID":"68904f9b-5534-4133-92d8-37d36c09d27d","Type":"ContainerDied","Data":"d9c2043fb93258c20e2a4a4363a39c9cc3860f68cae57edb0203a7960c762daf"} Mar 21 04:06:11 crc kubenswrapper[4685]: I0321 04:06:11.238719 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9c2043fb93258c20e2a4a4363a39c9cc3860f68cae57edb0203a7960c762daf" Mar 21 04:06:11 crc kubenswrapper[4685]: I0321 04:06:11.238747 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-d8e1-account-create-update-hrpbv" Mar 21 04:06:12 crc kubenswrapper[4685]: I0321 04:06:12.721702 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-f8frz"] Mar 21 04:06:12 crc kubenswrapper[4685]: E0321 04:06:12.722437 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4048b1a3-3a1f-40c7-be26-3f917fb8b0bd" containerName="mariadb-database-create" Mar 21 04:06:12 crc kubenswrapper[4685]: I0321 04:06:12.722449 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="4048b1a3-3a1f-40c7-be26-3f917fb8b0bd" containerName="mariadb-database-create" Mar 21 04:06:12 crc kubenswrapper[4685]: E0321 04:06:12.722460 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68904f9b-5534-4133-92d8-37d36c09d27d" containerName="mariadb-account-create-update" Mar 21 04:06:12 crc kubenswrapper[4685]: I0321 04:06:12.722467 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="68904f9b-5534-4133-92d8-37d36c09d27d" containerName="mariadb-account-create-update" Mar 21 04:06:12 crc kubenswrapper[4685]: I0321 04:06:12.722572 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="68904f9b-5534-4133-92d8-37d36c09d27d" containerName="mariadb-account-create-update" Mar 21 04:06:12 crc kubenswrapper[4685]: I0321 04:06:12.722584 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="4048b1a3-3a1f-40c7-be26-3f917fb8b0bd" containerName="mariadb-database-create" Mar 21 04:06:12 crc kubenswrapper[4685]: I0321 04:06:12.723034 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-f8frz" Mar 21 04:06:12 crc kubenswrapper[4685]: I0321 04:06:12.725019 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"combined-ca-bundle" Mar 21 04:06:12 crc kubenswrapper[4685]: I0321 04:06:12.725203 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-barbican-dockercfg-tbzlh" Mar 21 04:06:12 crc kubenswrapper[4685]: I0321 04:06:12.767198 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f40812e-2d66-4445-9549-e1431ba3de71-combined-ca-bundle\") pod \"barbican-db-sync-f8frz\" (UID: \"3f40812e-2d66-4445-9549-e1431ba3de71\") " pod="barbican-kuttl-tests/barbican-db-sync-f8frz" Mar 21 04:06:12 crc kubenswrapper[4685]: I0321 04:06:12.767249 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3f40812e-2d66-4445-9549-e1431ba3de71-db-sync-config-data\") pod \"barbican-db-sync-f8frz\" (UID: \"3f40812e-2d66-4445-9549-e1431ba3de71\") " pod="barbican-kuttl-tests/barbican-db-sync-f8frz" Mar 21 04:06:12 crc kubenswrapper[4685]: I0321 04:06:12.767322 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7srr\" (UniqueName: \"kubernetes.io/projected/3f40812e-2d66-4445-9549-e1431ba3de71-kube-api-access-g7srr\") pod \"barbican-db-sync-f8frz\" (UID: \"3f40812e-2d66-4445-9549-e1431ba3de71\") " pod="barbican-kuttl-tests/barbican-db-sync-f8frz" Mar 21 04:06:12 crc kubenswrapper[4685]: I0321 04:06:12.843968 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-f8frz"] Mar 21 04:06:12 crc kubenswrapper[4685]: I0321 
04:06:12.868756 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7srr\" (UniqueName: \"kubernetes.io/projected/3f40812e-2d66-4445-9549-e1431ba3de71-kube-api-access-g7srr\") pod \"barbican-db-sync-f8frz\" (UID: \"3f40812e-2d66-4445-9549-e1431ba3de71\") " pod="barbican-kuttl-tests/barbican-db-sync-f8frz" Mar 21 04:06:12 crc kubenswrapper[4685]: I0321 04:06:12.868904 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f40812e-2d66-4445-9549-e1431ba3de71-combined-ca-bundle\") pod \"barbican-db-sync-f8frz\" (UID: \"3f40812e-2d66-4445-9549-e1431ba3de71\") " pod="barbican-kuttl-tests/barbican-db-sync-f8frz" Mar 21 04:06:12 crc kubenswrapper[4685]: I0321 04:06:12.868961 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3f40812e-2d66-4445-9549-e1431ba3de71-db-sync-config-data\") pod \"barbican-db-sync-f8frz\" (UID: \"3f40812e-2d66-4445-9549-e1431ba3de71\") " pod="barbican-kuttl-tests/barbican-db-sync-f8frz" Mar 21 04:06:12 crc kubenswrapper[4685]: I0321 04:06:12.874615 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f40812e-2d66-4445-9549-e1431ba3de71-combined-ca-bundle\") pod \"barbican-db-sync-f8frz\" (UID: \"3f40812e-2d66-4445-9549-e1431ba3de71\") " pod="barbican-kuttl-tests/barbican-db-sync-f8frz" Mar 21 04:06:12 crc kubenswrapper[4685]: I0321 04:06:12.887990 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3f40812e-2d66-4445-9549-e1431ba3de71-db-sync-config-data\") pod \"barbican-db-sync-f8frz\" (UID: \"3f40812e-2d66-4445-9549-e1431ba3de71\") " pod="barbican-kuttl-tests/barbican-db-sync-f8frz" Mar 21 04:06:12 crc kubenswrapper[4685]: I0321 04:06:12.888617 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7srr\" (UniqueName: \"kubernetes.io/projected/3f40812e-2d66-4445-9549-e1431ba3de71-kube-api-access-g7srr\") pod \"barbican-db-sync-f8frz\" (UID: \"3f40812e-2d66-4445-9549-e1431ba3de71\") " pod="barbican-kuttl-tests/barbican-db-sync-f8frz" Mar 21 04:06:13 crc kubenswrapper[4685]: I0321 04:06:13.037544 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-f8frz" Mar 21 04:06:13 crc kubenswrapper[4685]: I0321 04:06:13.463089 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-f8frz"] Mar 21 04:06:14 crc kubenswrapper[4685]: I0321 04:06:14.261330 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-f8frz" event={"ID":"3f40812e-2d66-4445-9549-e1431ba3de71","Type":"ContainerStarted","Data":"ca6361bdc8f807ade02a12c7a69bc2039fd0386bea3e6cc678936a1e5479fdf0"} Mar 21 04:06:14 crc kubenswrapper[4685]: I0321 04:06:14.261646 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-f8frz" event={"ID":"3f40812e-2d66-4445-9549-e1431ba3de71","Type":"ContainerStarted","Data":"20ed1a95fc9d6594bd6032d42b6518237e35082c1116c47a28d1770ab76ece85"} Mar 21 04:06:14 crc kubenswrapper[4685]: I0321 04:06:14.292736 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-db-sync-f8frz" podStartSLOduration=2.292712691 podStartE2EDuration="2.292712691s" podCreationTimestamp="2026-03-21 04:06:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:06:14.288095032 +0000 UTC m=+1206.765163824" watchObservedRunningTime="2026-03-21 04:06:14.292712691 +0000 UTC m=+1206.769781483" Mar 21 04:06:15 crc kubenswrapper[4685]: I0321 04:06:15.271513 4685 generic.go:334] "Generic (PLEG): container finished" podID="3f40812e-2d66-4445-9549-e1431ba3de71" containerID="ca6361bdc8f807ade02a12c7a69bc2039fd0386bea3e6cc678936a1e5479fdf0" exitCode=0 Mar 21 04:06:15 crc kubenswrapper[4685]: I0321 04:06:15.271553 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-f8frz" event={"ID":"3f40812e-2d66-4445-9549-e1431ba3de71","Type":"ContainerDied","Data":"ca6361bdc8f807ade02a12c7a69bc2039fd0386bea3e6cc678936a1e5479fdf0"} Mar 21 04:06:16 crc kubenswrapper[4685]: I0321 04:06:16.575254 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-f8frz" Mar 21 04:06:16 crc kubenswrapper[4685]: I0321 04:06:16.621830 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f40812e-2d66-4445-9549-e1431ba3de71-combined-ca-bundle\") pod \"3f40812e-2d66-4445-9549-e1431ba3de71\" (UID: \"3f40812e-2d66-4445-9549-e1431ba3de71\") " Mar 21 04:06:16 crc kubenswrapper[4685]: I0321 04:06:16.621978 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7srr\" (UniqueName: \"kubernetes.io/projected/3f40812e-2d66-4445-9549-e1431ba3de71-kube-api-access-g7srr\") pod \"3f40812e-2d66-4445-9549-e1431ba3de71\" (UID: \"3f40812e-2d66-4445-9549-e1431ba3de71\") " Mar 21 04:06:16 crc kubenswrapper[4685]: I0321 04:06:16.622158 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3f40812e-2d66-4445-9549-e1431ba3de71-db-sync-config-data\") pod \"3f40812e-2d66-4445-9549-e1431ba3de71\" (UID: \"3f40812e-2d66-4445-9549-e1431ba3de71\") " Mar 21 04:06:16 crc kubenswrapper[4685]: I0321 04:06:16.627427 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f40812e-2d66-4445-9549-e1431ba3de71-kube-api-access-g7srr" (OuterVolumeSpecName: "kube-api-access-g7srr") pod "3f40812e-2d66-4445-9549-e1431ba3de71" (UID: "3f40812e-2d66-4445-9549-e1431ba3de71"). InnerVolumeSpecName "kube-api-access-g7srr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:06:16 crc kubenswrapper[4685]: I0321 04:06:16.627494 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f40812e-2d66-4445-9549-e1431ba3de71-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3f40812e-2d66-4445-9549-e1431ba3de71" (UID: "3f40812e-2d66-4445-9549-e1431ba3de71"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:06:16 crc kubenswrapper[4685]: I0321 04:06:16.646611 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f40812e-2d66-4445-9549-e1431ba3de71-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f40812e-2d66-4445-9549-e1431ba3de71" (UID: "3f40812e-2d66-4445-9549-e1431ba3de71"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:06:16 crc kubenswrapper[4685]: I0321 04:06:16.723547 4685 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3f40812e-2d66-4445-9549-e1431ba3de71-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:16 crc kubenswrapper[4685]: I0321 04:06:16.723607 4685 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f40812e-2d66-4445-9549-e1431ba3de71-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:16 crc kubenswrapper[4685]: I0321 04:06:16.723616 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7srr\" (UniqueName: \"kubernetes.io/projected/3f40812e-2d66-4445-9549-e1431ba3de71-kube-api-access-g7srr\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.298117 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-f8frz" event={"ID":"3f40812e-2d66-4445-9549-e1431ba3de71","Type":"ContainerDied","Data":"20ed1a95fc9d6594bd6032d42b6518237e35082c1116c47a28d1770ab76ece85"} Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.300937 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20ed1a95fc9d6594bd6032d42b6518237e35082c1116c47a28d1770ab76ece85" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.298500 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-f8frz" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.755057 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-worker-76848f96c5-bkdvl"] Mar 21 04:06:17 crc kubenswrapper[4685]: E0321 04:06:17.757141 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f40812e-2d66-4445-9549-e1431ba3de71" containerName="barbican-db-sync" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.757277 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f40812e-2d66-4445-9549-e1431ba3de71" containerName="barbican-db-sync" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.757614 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f40812e-2d66-4445-9549-e1431ba3de71" containerName="barbican-db-sync" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.758740 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-76848f96c5-bkdvl" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.763294 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"combined-ca-bundle" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.764324 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-barbican-dockercfg-tbzlh" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.775354 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-5cb86f4b6b-mmswh"] Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.776378 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-5cb86f4b6b-mmswh" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.784173 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-worker-76848f96c5-bkdvl"] Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.795723 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-5cb86f4b6b-mmswh"] Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.846287 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ec89a16-8152-4ea8-82b3-2ace0ec166e0-config-data-custom\") pod \"barbican-keystone-listener-5cb86f4b6b-mmswh\" (UID: \"6ec89a16-8152-4ea8-82b3-2ace0ec166e0\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5cb86f4b6b-mmswh" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.846331 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ec89a16-8152-4ea8-82b3-2ace0ec166e0-config-data\") pod \"barbican-keystone-listener-5cb86f4b6b-mmswh\" (UID: \"6ec89a16-8152-4ea8-82b3-2ace0ec166e0\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5cb86f4b6b-mmswh" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.846354 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7abf263e-1caf-4c51-b312-0246e3f5d374-logs\") pod \"barbican-worker-76848f96c5-bkdvl\" (UID: \"7abf263e-1caf-4c51-b312-0246e3f5d374\") " pod="barbican-kuttl-tests/barbican-worker-76848f96c5-bkdvl" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.846424 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7abf263e-1caf-4c51-b312-0246e3f5d374-config-data\") pod \"barbican-worker-76848f96c5-bkdvl\" (UID: \"7abf263e-1caf-4c51-b312-0246e3f5d374\") " pod="barbican-kuttl-tests/barbican-worker-76848f96c5-bkdvl" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.846441 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7abf263e-1caf-4c51-b312-0246e3f5d374-combined-ca-bundle\") pod \"barbican-worker-76848f96c5-bkdvl\" (UID: \"7abf263e-1caf-4c51-b312-0246e3f5d374\") " pod="barbican-kuttl-tests/barbican-worker-76848f96c5-bkdvl" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.846466 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ec89a16-8152-4ea8-82b3-2ace0ec166e0-combined-ca-bundle\") pod \"barbican-keystone-listener-5cb86f4b6b-mmswh\" (UID: \"6ec89a16-8152-4ea8-82b3-2ace0ec166e0\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5cb86f4b6b-mmswh" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.846552 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvbkp\" (UniqueName: \"kubernetes.io/projected/6ec89a16-8152-4ea8-82b3-2ace0ec166e0-kube-api-access-dvbkp\") pod \"barbican-keystone-listener-5cb86f4b6b-mmswh\" (UID: \"6ec89a16-8152-4ea8-82b3-2ace0ec166e0\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5cb86f4b6b-mmswh" Mar 21 04:06:17 crc 
kubenswrapper[4685]: I0321 04:06:17.846612 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ec89a16-8152-4ea8-82b3-2ace0ec166e0-logs\") pod \"barbican-keystone-listener-5cb86f4b6b-mmswh\" (UID: \"6ec89a16-8152-4ea8-82b3-2ace0ec166e0\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5cb86f4b6b-mmswh" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.846761 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h24cx\" (UniqueName: \"kubernetes.io/projected/7abf263e-1caf-4c51-b312-0246e3f5d374-kube-api-access-h24cx\") pod \"barbican-worker-76848f96c5-bkdvl\" (UID: \"7abf263e-1caf-4c51-b312-0246e3f5d374\") " pod="barbican-kuttl-tests/barbican-worker-76848f96c5-bkdvl" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.846928 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7abf263e-1caf-4c51-b312-0246e3f5d374-config-data-custom\") pod \"barbican-worker-76848f96c5-bkdvl\" (UID: \"7abf263e-1caf-4c51-b312-0246e3f5d374\") " pod="barbican-kuttl-tests/barbican-worker-76848f96c5-bkdvl" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.892753 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn"] Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.894285 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.897269 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"cert-barbican-public-svc" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.897421 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-api-config-data" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.904016 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"cert-barbican-internal-svc" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.906886 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn"] Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.947817 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h24cx\" (UniqueName: \"kubernetes.io/projected/7abf263e-1caf-4c51-b312-0246e3f5d374-kube-api-access-h24cx\") pod \"barbican-worker-76848f96c5-bkdvl\" (UID: \"7abf263e-1caf-4c51-b312-0246e3f5d374\") " pod="barbican-kuttl-tests/barbican-worker-76848f96c5-bkdvl" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.947901 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53773579-45fd-485e-b876-2f1217ebe807-config-data\") pod \"barbican-api-7bddf7b4cd-8bsrn\" (UID: \"53773579-45fd-485e-b876-2f1217ebe807\") " pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.947929 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/53773579-45fd-485e-b876-2f1217ebe807-internal-tls-certs\") pod \"barbican-api-7bddf7b4cd-8bsrn\" (UID: 
\"53773579-45fd-485e-b876-2f1217ebe807\") " pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.947955 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/53773579-45fd-485e-b876-2f1217ebe807-config-data-custom\") pod \"barbican-api-7bddf7b4cd-8bsrn\" (UID: \"53773579-45fd-485e-b876-2f1217ebe807\") " pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.948034 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53773579-45fd-485e-b876-2f1217ebe807-logs\") pod \"barbican-api-7bddf7b4cd-8bsrn\" (UID: \"53773579-45fd-485e-b876-2f1217ebe807\") " pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.948056 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7abf263e-1caf-4c51-b312-0246e3f5d374-config-data-custom\") pod \"barbican-worker-76848f96c5-bkdvl\" (UID: \"7abf263e-1caf-4c51-b312-0246e3f5d374\") " pod="barbican-kuttl-tests/barbican-worker-76848f96c5-bkdvl" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.948097 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ec89a16-8152-4ea8-82b3-2ace0ec166e0-config-data-custom\") pod \"barbican-keystone-listener-5cb86f4b6b-mmswh\" (UID: \"6ec89a16-8152-4ea8-82b3-2ace0ec166e0\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5cb86f4b6b-mmswh" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.948119 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ec89a16-8152-4ea8-82b3-2ace0ec166e0-config-data\") pod \"barbican-keystone-listener-5cb86f4b6b-mmswh\" (UID: \"6ec89a16-8152-4ea8-82b3-2ace0ec166e0\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5cb86f4b6b-mmswh" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.948740 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7abf263e-1caf-4c51-b312-0246e3f5d374-logs\") pod \"barbican-worker-76848f96c5-bkdvl\" (UID: \"7abf263e-1caf-4c51-b312-0246e3f5d374\") " pod="barbican-kuttl-tests/barbican-worker-76848f96c5-bkdvl" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.948772 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/53773579-45fd-485e-b876-2f1217ebe807-public-tls-certs\") pod \"barbican-api-7bddf7b4cd-8bsrn\" (UID: \"53773579-45fd-485e-b876-2f1217ebe807\") " pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.948792 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52nrs\" (UniqueName: \"kubernetes.io/projected/53773579-45fd-485e-b876-2f1217ebe807-kube-api-access-52nrs\") pod \"barbican-api-7bddf7b4cd-8bsrn\" (UID: \"53773579-45fd-485e-b876-2f1217ebe807\") " pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.948822 4685 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7abf263e-1caf-4c51-b312-0246e3f5d374-config-data\") pod \"barbican-worker-76848f96c5-bkdvl\" (UID: \"7abf263e-1caf-4c51-b312-0246e3f5d374\") " pod="barbican-kuttl-tests/barbican-worker-76848f96c5-bkdvl" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.948855 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7abf263e-1caf-4c51-b312-0246e3f5d374-combined-ca-bundle\") pod \"barbican-worker-76848f96c5-bkdvl\" (UID: \"7abf263e-1caf-4c51-b312-0246e3f5d374\") " pod="barbican-kuttl-tests/barbican-worker-76848f96c5-bkdvl" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.948883 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ec89a16-8152-4ea8-82b3-2ace0ec166e0-combined-ca-bundle\") pod \"barbican-keystone-listener-5cb86f4b6b-mmswh\" (UID: \"6ec89a16-8152-4ea8-82b3-2ace0ec166e0\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5cb86f4b6b-mmswh" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.948898 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53773579-45fd-485e-b876-2f1217ebe807-combined-ca-bundle\") pod \"barbican-api-7bddf7b4cd-8bsrn\" (UID: \"53773579-45fd-485e-b876-2f1217ebe807\") " pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.948924 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvbkp\" (UniqueName: \"kubernetes.io/projected/6ec89a16-8152-4ea8-82b3-2ace0ec166e0-kube-api-access-dvbkp\") pod \"barbican-keystone-listener-5cb86f4b6b-mmswh\" (UID: \"6ec89a16-8152-4ea8-82b3-2ace0ec166e0\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5cb86f4b6b-mmswh" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.948941 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ec89a16-8152-4ea8-82b3-2ace0ec166e0-logs\") pod \"barbican-keystone-listener-5cb86f4b6b-mmswh\" (UID: \"6ec89a16-8152-4ea8-82b3-2ace0ec166e0\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5cb86f4b6b-mmswh" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.949115 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7abf263e-1caf-4c51-b312-0246e3f5d374-logs\") pod \"barbican-worker-76848f96c5-bkdvl\" (UID: \"7abf263e-1caf-4c51-b312-0246e3f5d374\") " pod="barbican-kuttl-tests/barbican-worker-76848f96c5-bkdvl" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.949248 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ec89a16-8152-4ea8-82b3-2ace0ec166e0-logs\") pod \"barbican-keystone-listener-5cb86f4b6b-mmswh\" (UID: \"6ec89a16-8152-4ea8-82b3-2ace0ec166e0\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5cb86f4b6b-mmswh" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.953276 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ec89a16-8152-4ea8-82b3-2ace0ec166e0-combined-ca-bundle\") pod \"barbican-keystone-listener-5cb86f4b6b-mmswh\" (UID: 
\"6ec89a16-8152-4ea8-82b3-2ace0ec166e0\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5cb86f4b6b-mmswh" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.953859 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ec89a16-8152-4ea8-82b3-2ace0ec166e0-config-data\") pod \"barbican-keystone-listener-5cb86f4b6b-mmswh\" (UID: \"6ec89a16-8152-4ea8-82b3-2ace0ec166e0\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5cb86f4b6b-mmswh" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.954118 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7abf263e-1caf-4c51-b312-0246e3f5d374-config-data-custom\") pod \"barbican-worker-76848f96c5-bkdvl\" (UID: \"7abf263e-1caf-4c51-b312-0246e3f5d374\") " pod="barbican-kuttl-tests/barbican-worker-76848f96c5-bkdvl" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.954409 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7abf263e-1caf-4c51-b312-0246e3f5d374-config-data\") pod \"barbican-worker-76848f96c5-bkdvl\" (UID: \"7abf263e-1caf-4c51-b312-0246e3f5d374\") " pod="barbican-kuttl-tests/barbican-worker-76848f96c5-bkdvl" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.956150 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7abf263e-1caf-4c51-b312-0246e3f5d374-combined-ca-bundle\") pod \"barbican-worker-76848f96c5-bkdvl\" (UID: \"7abf263e-1caf-4c51-b312-0246e3f5d374\") " pod="barbican-kuttl-tests/barbican-worker-76848f96c5-bkdvl" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.965746 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvbkp\" (UniqueName: \"kubernetes.io/projected/6ec89a16-8152-4ea8-82b3-2ace0ec166e0-kube-api-access-dvbkp\") pod \"barbican-keystone-listener-5cb86f4b6b-mmswh\" (UID: \"6ec89a16-8152-4ea8-82b3-2ace0ec166e0\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5cb86f4b6b-mmswh" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.966117 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h24cx\" (UniqueName: \"kubernetes.io/projected/7abf263e-1caf-4c51-b312-0246e3f5d374-kube-api-access-h24cx\") pod \"barbican-worker-76848f96c5-bkdvl\" (UID: \"7abf263e-1caf-4c51-b312-0246e3f5d374\") " pod="barbican-kuttl-tests/barbican-worker-76848f96c5-bkdvl" Mar 21 04:06:17 crc kubenswrapper[4685]: I0321 04:06:17.973568 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ec89a16-8152-4ea8-82b3-2ace0ec166e0-config-data-custom\") pod \"barbican-keystone-listener-5cb86f4b6b-mmswh\" (UID: \"6ec89a16-8152-4ea8-82b3-2ace0ec166e0\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5cb86f4b6b-mmswh" Mar 21 04:06:18 crc kubenswrapper[4685]: I0321 04:06:18.050475 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/53773579-45fd-485e-b876-2f1217ebe807-public-tls-certs\") pod \"barbican-api-7bddf7b4cd-8bsrn\" (UID: \"53773579-45fd-485e-b876-2f1217ebe807\") " pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" Mar 21 04:06:18 crc kubenswrapper[4685]: I0321 04:06:18.050514 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-52nrs\" (UniqueName: \"kubernetes.io/projected/53773579-45fd-485e-b876-2f1217ebe807-kube-api-access-52nrs\") pod \"barbican-api-7bddf7b4cd-8bsrn\" (UID: \"53773579-45fd-485e-b876-2f1217ebe807\") " pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" Mar 21 04:06:18 crc kubenswrapper[4685]: I0321 04:06:18.050559 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53773579-45fd-485e-b876-2f1217ebe807-combined-ca-bundle\") pod \"barbican-api-7bddf7b4cd-8bsrn\" (UID: \"53773579-45fd-485e-b876-2f1217ebe807\") " pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" Mar 21 04:06:18 crc kubenswrapper[4685]: I0321 04:06:18.050612 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53773579-45fd-485e-b876-2f1217ebe807-config-data\") pod \"barbican-api-7bddf7b4cd-8bsrn\" (UID: \"53773579-45fd-485e-b876-2f1217ebe807\") " pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" Mar 21 04:06:18 crc kubenswrapper[4685]: I0321 04:06:18.050633 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/53773579-45fd-485e-b876-2f1217ebe807-internal-tls-certs\") pod \"barbican-api-7bddf7b4cd-8bsrn\" (UID: \"53773579-45fd-485e-b876-2f1217ebe807\") " pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" Mar 21 04:06:18 crc kubenswrapper[4685]: I0321 04:06:18.050658 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/53773579-45fd-485e-b876-2f1217ebe807-config-data-custom\") pod \"barbican-api-7bddf7b4cd-8bsrn\" (UID: \"53773579-45fd-485e-b876-2f1217ebe807\") " pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" Mar 21 04:06:18 crc kubenswrapper[4685]: I0321 04:06:18.050672 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53773579-45fd-485e-b876-2f1217ebe807-logs\") pod \"barbican-api-7bddf7b4cd-8bsrn\" (UID: \"53773579-45fd-485e-b876-2f1217ebe807\") " pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" Mar 21 04:06:18 crc kubenswrapper[4685]: I0321 04:06:18.051091 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53773579-45fd-485e-b876-2f1217ebe807-logs\") pod \"barbican-api-7bddf7b4cd-8bsrn\" (UID: \"53773579-45fd-485e-b876-2f1217ebe807\") " pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" Mar 21 04:06:18 crc kubenswrapper[4685]: I0321 04:06:18.054126 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/53773579-45fd-485e-b876-2f1217ebe807-config-data-custom\") pod \"barbican-api-7bddf7b4cd-8bsrn\" (UID: \"53773579-45fd-485e-b876-2f1217ebe807\") " pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" Mar 21 04:06:18 crc kubenswrapper[4685]: I0321 04:06:18.054172 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53773579-45fd-485e-b876-2f1217ebe807-combined-ca-bundle\") pod \"barbican-api-7bddf7b4cd-8bsrn\" (UID: \"53773579-45fd-485e-b876-2f1217ebe807\") " pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" Mar 21 04:06:18 crc kubenswrapper[4685]: I0321 04:06:18.054492 4685 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53773579-45fd-485e-b876-2f1217ebe807-config-data\") pod \"barbican-api-7bddf7b4cd-8bsrn\" (UID: \"53773579-45fd-485e-b876-2f1217ebe807\") " pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" Mar 21 04:06:18 crc kubenswrapper[4685]: I0321 04:06:18.062302 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/53773579-45fd-485e-b876-2f1217ebe807-internal-tls-certs\") pod \"barbican-api-7bddf7b4cd-8bsrn\" (UID: \"53773579-45fd-485e-b876-2f1217ebe807\") " pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" Mar 21 04:06:18 crc kubenswrapper[4685]: I0321 04:06:18.068617 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/53773579-45fd-485e-b876-2f1217ebe807-public-tls-certs\") pod \"barbican-api-7bddf7b4cd-8bsrn\" (UID: \"53773579-45fd-485e-b876-2f1217ebe807\") " pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" Mar 21 04:06:18 crc kubenswrapper[4685]: I0321 04:06:18.071060 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52nrs\" (UniqueName: \"kubernetes.io/projected/53773579-45fd-485e-b876-2f1217ebe807-kube-api-access-52nrs\") pod \"barbican-api-7bddf7b4cd-8bsrn\" (UID: \"53773579-45fd-485e-b876-2f1217ebe807\") " pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" Mar 21 04:06:18 crc kubenswrapper[4685]: I0321 04:06:18.091788 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-76848f96c5-bkdvl" Mar 21 04:06:18 crc kubenswrapper[4685]: I0321 04:06:18.103973 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-5cb86f4b6b-mmswh" Mar 21 04:06:18 crc kubenswrapper[4685]: I0321 04:06:18.206684 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:18.462323 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn"] Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:18.555885 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-f8frz"] Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:18.566097 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-f8frz"] Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:18.595480 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbicand8e1-account-delete-vjx8v"] Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:18.596502 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbicand8e1-account-delete-vjx8v" Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:18.611003 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbicand8e1-account-delete-vjx8v"] Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:18.623089 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-worker-76848f96c5-bkdvl"] Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:18.657739 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eafc03fc-f5ca-4c3a-bd90-d9cac3ab3edb-operator-scripts\") pod \"barbicand8e1-account-delete-vjx8v\" (UID: \"eafc03fc-f5ca-4c3a-bd90-d9cac3ab3edb\") " pod="barbican-kuttl-tests/barbicand8e1-account-delete-vjx8v" Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:18.657864 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xvh7\" (UniqueName: \"kubernetes.io/projected/eafc03fc-f5ca-4c3a-bd90-d9cac3ab3edb-kube-api-access-9xvh7\") pod \"barbicand8e1-account-delete-vjx8v\" (UID: \"eafc03fc-f5ca-4c3a-bd90-d9cac3ab3edb\") " pod="barbican-kuttl-tests/barbicand8e1-account-delete-vjx8v" Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:18.658068 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-5cb86f4b6b-mmswh"] Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:18.676718 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-worker-76848f96c5-bkdvl"] Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:18.693953 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn"] Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:18.705867 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-5cb86f4b6b-mmswh"] Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:18.759847 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xvh7\" (UniqueName: \"kubernetes.io/projected/eafc03fc-f5ca-4c3a-bd90-d9cac3ab3edb-kube-api-access-9xvh7\") pod \"barbicand8e1-account-delete-vjx8v\" (UID: \"eafc03fc-f5ca-4c3a-bd90-d9cac3ab3edb\") " pod="barbican-kuttl-tests/barbicand8e1-account-delete-vjx8v" Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:18.760404 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eafc03fc-f5ca-4c3a-bd90-d9cac3ab3edb-operator-scripts\") pod \"barbicand8e1-account-delete-vjx8v\" (UID: \"eafc03fc-f5ca-4c3a-bd90-d9cac3ab3edb\") " pod="barbican-kuttl-tests/barbicand8e1-account-delete-vjx8v" Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:18.761164 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eafc03fc-f5ca-4c3a-bd90-d9cac3ab3edb-operator-scripts\") pod \"barbicand8e1-account-delete-vjx8v\" (UID: \"eafc03fc-f5ca-4c3a-bd90-d9cac3ab3edb\") " pod="barbican-kuttl-tests/barbicand8e1-account-delete-vjx8v" Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:18.783658 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xvh7\" (UniqueName: 
\"kubernetes.io/projected/eafc03fc-f5ca-4c3a-bd90-d9cac3ab3edb-kube-api-access-9xvh7\") pod \"barbicand8e1-account-delete-vjx8v\" (UID: \"eafc03fc-f5ca-4c3a-bd90-d9cac3ab3edb\") " pod="barbican-kuttl-tests/barbicand8e1-account-delete-vjx8v" Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:18.909501 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbicand8e1-account-delete-vjx8v" Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:19.318715 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-5cb86f4b6b-mmswh" event={"ID":"6ec89a16-8152-4ea8-82b3-2ace0ec166e0","Type":"ContainerStarted","Data":"c6054c68a424789d66d5df2a1d42aa7855bb9fc484df06c06032065702880ad2"} Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:19.320204 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-76848f96c5-bkdvl" event={"ID":"7abf263e-1caf-4c51-b312-0246e3f5d374","Type":"ContainerStarted","Data":"d835dad5e3d9573788f877fe4aa9d870df6381ed3abbe874b5e13f02e253c397"} Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:19.321915 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" event={"ID":"53773579-45fd-485e-b876-2f1217ebe807","Type":"ContainerStarted","Data":"d536020507bfc23fb2f17e14434e051bbbb319e14261dfbead27adad6c02f264"} Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:20.309724 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f40812e-2d66-4445-9549-e1431ba3de71" path="/var/lib/kubelet/pods/3f40812e-2d66-4445-9549-e1431ba3de71/volumes" Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:20.342613 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-76848f96c5-bkdvl" event={"ID":"7abf263e-1caf-4c51-b312-0246e3f5d374","Type":"ContainerStarted","Data":"352fb245e5d35cecab1a4ff376f6c7b12cf24ae5465c9aeb28fc0193d7b53746"} Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:20.344179 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" event={"ID":"53773579-45fd-485e-b876-2f1217ebe807","Type":"ContainerStarted","Data":"8bfecfb645f4196e9dccd3ee2296d5172a1a4ab24c737ee806dc3c54ba1dac61"} Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:20.346031 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-5cb86f4b6b-mmswh" event={"ID":"6ec89a16-8152-4ea8-82b3-2ace0ec166e0","Type":"ContainerStarted","Data":"f2cd686ac86d49332eea5741ac5afc036152e4e2730b0cffbfd2d6bebbbd78e0"} Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:22.363377 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-5cb86f4b6b-mmswh" event={"ID":"6ec89a16-8152-4ea8-82b3-2ace0ec166e0","Type":"ContainerStarted","Data":"6d8176121d09438ebad2adae26248f7dbba93317ba8e9b014e3d67e90f02a03c"} Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:22.363454 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-keystone-listener-5cb86f4b6b-mmswh" podUID="6ec89a16-8152-4ea8-82b3-2ace0ec166e0" containerName="barbican-keystone-listener-log" containerID="cri-o://f2cd686ac86d49332eea5741ac5afc036152e4e2730b0cffbfd2d6bebbbd78e0" gracePeriod=30 Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:22.363527 4685 kuberuntime_container.go:808] "Killing container with 
a grace period" pod="barbican-kuttl-tests/barbican-keystone-listener-5cb86f4b6b-mmswh" podUID="6ec89a16-8152-4ea8-82b3-2ace0ec166e0" containerName="barbican-keystone-listener" containerID="cri-o://6d8176121d09438ebad2adae26248f7dbba93317ba8e9b014e3d67e90f02a03c" gracePeriod=30 Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:22.366200 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-76848f96c5-bkdvl" event={"ID":"7abf263e-1caf-4c51-b312-0246e3f5d374","Type":"ContainerStarted","Data":"7915a83dda864efb67074f26a3bdbf79eaf343e632a7b814a46c79b5d8a1f16b"} Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:22.366286 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-worker-76848f96c5-bkdvl" podUID="7abf263e-1caf-4c51-b312-0246e3f5d374" containerName="barbican-worker" containerID="cri-o://7915a83dda864efb67074f26a3bdbf79eaf343e632a7b814a46c79b5d8a1f16b" gracePeriod=30 Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:22.366287 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-worker-76848f96c5-bkdvl" podUID="7abf263e-1caf-4c51-b312-0246e3f5d374" containerName="barbican-worker-log" containerID="cri-o://352fb245e5d35cecab1a4ff376f6c7b12cf24ae5465c9aeb28fc0193d7b53746" gracePeriod=30 Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:22.368329 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" event={"ID":"53773579-45fd-485e-b876-2f1217ebe807","Type":"ContainerStarted","Data":"766064c8289fd4eef4ca5074f92d1dfff93525b7fefc5415bdde816b0cca8d18"} Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:22.368452 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" podUID="53773579-45fd-485e-b876-2f1217ebe807" containerName="barbican-api-log" containerID="cri-o://8bfecfb645f4196e9dccd3ee2296d5172a1a4ab24c737ee806dc3c54ba1dac61" gracePeriod=30 Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:22.368532 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:22.368560 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:22.368600 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" podUID="53773579-45fd-485e-b876-2f1217ebe807" containerName="barbican-api" containerID="cri-o://766064c8289fd4eef4ca5074f92d1dfff93525b7fefc5415bdde816b0cca8d18" gracePeriod=30 Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:22.393480 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-keystone-listener-5cb86f4b6b-mmswh" podStartSLOduration=5.393463797 podStartE2EDuration="5.393463797s" podCreationTimestamp="2026-03-21 04:06:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:06:22.388667633 +0000 UTC m=+1214.865736435" watchObservedRunningTime="2026-03-21 04:06:22.393463797 +0000 UTC m=+1214.870532589" Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:22.412175 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" podStartSLOduration=5.412160281 podStartE2EDuration="5.412160281s" podCreationTimestamp="2026-03-21 04:06:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:06:22.410625258 +0000 UTC m=+1214.887694080" watchObservedRunningTime="2026-03-21 04:06:22.412160281 +0000 UTC m=+1214.889229073" Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:22.432405 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-worker-76848f96c5-bkdvl" podStartSLOduration=5.4323803569999995 podStartE2EDuration="5.432380357s" podCreationTimestamp="2026-03-21 04:06:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:06:22.42639986 +0000 UTC m=+1214.903468652" watchObservedRunningTime="2026-03-21 04:06:22.432380357 +0000 UTC m=+1214.909449169" Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:23.377613 4685 generic.go:334] "Generic (PLEG): container finished" podID="6ec89a16-8152-4ea8-82b3-2ace0ec166e0" containerID="f2cd686ac86d49332eea5741ac5afc036152e4e2730b0cffbfd2d6bebbbd78e0" exitCode=143 Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:23.377703 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-5cb86f4b6b-mmswh" event={"ID":"6ec89a16-8152-4ea8-82b3-2ace0ec166e0","Type":"ContainerDied","Data":"f2cd686ac86d49332eea5741ac5afc036152e4e2730b0cffbfd2d6bebbbd78e0"} Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:23.380653 4685 generic.go:334] "Generic (PLEG): container finished" podID="7abf263e-1caf-4c51-b312-0246e3f5d374" containerID="352fb245e5d35cecab1a4ff376f6c7b12cf24ae5465c9aeb28fc0193d7b53746" exitCode=143 Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:23.380728 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-76848f96c5-bkdvl" event={"ID":"7abf263e-1caf-4c51-b312-0246e3f5d374","Type":"ContainerDied","Data":"352fb245e5d35cecab1a4ff376f6c7b12cf24ae5465c9aeb28fc0193d7b53746"} Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:23.382558 4685 generic.go:334] "Generic (PLEG): container finished" podID="53773579-45fd-485e-b876-2f1217ebe807" containerID="766064c8289fd4eef4ca5074f92d1dfff93525b7fefc5415bdde816b0cca8d18" exitCode=0 Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:23.382573 4685 generic.go:334] "Generic (PLEG): container finished" podID="53773579-45fd-485e-b876-2f1217ebe807" containerID="8bfecfb645f4196e9dccd3ee2296d5172a1a4ab24c737ee806dc3c54ba1dac61" exitCode=143 Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:23.382593 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" event={"ID":"53773579-45fd-485e-b876-2f1217ebe807","Type":"ContainerDied","Data":"766064c8289fd4eef4ca5074f92d1dfff93525b7fefc5415bdde816b0cca8d18"} Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:23.382616 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" event={"ID":"53773579-45fd-485e-b876-2f1217ebe807","Type":"ContainerDied","Data":"8bfecfb645f4196e9dccd3ee2296d5172a1a4ab24c737ee806dc3c54ba1dac61"} Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:23.426465 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["barbican-kuttl-tests/barbicand8e1-account-delete-vjx8v"] Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:23.448265 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:23.523280 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/53773579-45fd-485e-b876-2f1217ebe807-internal-tls-certs\") pod \"53773579-45fd-485e-b876-2f1217ebe807\" (UID: \"53773579-45fd-485e-b876-2f1217ebe807\") " Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:23.523354 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/53773579-45fd-485e-b876-2f1217ebe807-config-data-custom\") pod \"53773579-45fd-485e-b876-2f1217ebe807\" (UID: \"53773579-45fd-485e-b876-2f1217ebe807\") " Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:23.523430 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53773579-45fd-485e-b876-2f1217ebe807-logs\") pod \"53773579-45fd-485e-b876-2f1217ebe807\" (UID: \"53773579-45fd-485e-b876-2f1217ebe807\") " Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:23.523516 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53773579-45fd-485e-b876-2f1217ebe807-config-data\") pod \"53773579-45fd-485e-b876-2f1217ebe807\" (UID: \"53773579-45fd-485e-b876-2f1217ebe807\") " Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:23.523564 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52nrs\" (UniqueName: \"kubernetes.io/projected/53773579-45fd-485e-b876-2f1217ebe807-kube-api-access-52nrs\") pod \"53773579-45fd-485e-b876-2f1217ebe807\" (UID: \"53773579-45fd-485e-b876-2f1217ebe807\") " Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:23.523708 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/53773579-45fd-485e-b876-2f1217ebe807-public-tls-certs\") pod \"53773579-45fd-485e-b876-2f1217ebe807\" (UID: \"53773579-45fd-485e-b876-2f1217ebe807\") " Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:23.523734 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53773579-45fd-485e-b876-2f1217ebe807-combined-ca-bundle\") pod \"53773579-45fd-485e-b876-2f1217ebe807\" (UID: \"53773579-45fd-485e-b876-2f1217ebe807\") " Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:23.524297 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53773579-45fd-485e-b876-2f1217ebe807-logs" (OuterVolumeSpecName: "logs") pod "53773579-45fd-485e-b876-2f1217ebe807" (UID: "53773579-45fd-485e-b876-2f1217ebe807"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:23.528402 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53773579-45fd-485e-b876-2f1217ebe807-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "53773579-45fd-485e-b876-2f1217ebe807" (UID: "53773579-45fd-485e-b876-2f1217ebe807"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:23.531228 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53773579-45fd-485e-b876-2f1217ebe807-kube-api-access-52nrs" (OuterVolumeSpecName: "kube-api-access-52nrs") pod "53773579-45fd-485e-b876-2f1217ebe807" (UID: "53773579-45fd-485e-b876-2f1217ebe807"). InnerVolumeSpecName "kube-api-access-52nrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:23.548788 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53773579-45fd-485e-b876-2f1217ebe807-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53773579-45fd-485e-b876-2f1217ebe807" (UID: "53773579-45fd-485e-b876-2f1217ebe807"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:23.558775 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53773579-45fd-485e-b876-2f1217ebe807-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "53773579-45fd-485e-b876-2f1217ebe807" (UID: "53773579-45fd-485e-b876-2f1217ebe807"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:23.560091 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53773579-45fd-485e-b876-2f1217ebe807-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "53773579-45fd-485e-b876-2f1217ebe807" (UID: "53773579-45fd-485e-b876-2f1217ebe807"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:23.560954 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53773579-45fd-485e-b876-2f1217ebe807-config-data" (OuterVolumeSpecName: "config-data") pod "53773579-45fd-485e-b876-2f1217ebe807" (UID: "53773579-45fd-485e-b876-2f1217ebe807"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:23.625830 4685 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53773579-45fd-485e-b876-2f1217ebe807-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:23.626201 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52nrs\" (UniqueName: \"kubernetes.io/projected/53773579-45fd-485e-b876-2f1217ebe807-kube-api-access-52nrs\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:23.626297 4685 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/53773579-45fd-485e-b876-2f1217ebe807-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:23.626377 4685 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53773579-45fd-485e-b876-2f1217ebe807-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:23.626444 4685 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/53773579-45fd-485e-b876-2f1217ebe807-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:23.626603 4685 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/53773579-45fd-485e-b876-2f1217ebe807-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:23 crc kubenswrapper[4685]: I0321 04:06:23.626681 4685 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53773579-45fd-485e-b876-2f1217ebe807-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:24 crc kubenswrapper[4685]: I0321 04:06:24.391078 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" event={"ID":"53773579-45fd-485e-b876-2f1217ebe807","Type":"ContainerDied","Data":"d536020507bfc23fb2f17e14434e051bbbb319e14261dfbead27adad6c02f264"} Mar 21 04:06:24 crc kubenswrapper[4685]: I0321 04:06:24.391974 4685 scope.go:117] "RemoveContainer" containerID="766064c8289fd4eef4ca5074f92d1dfff93525b7fefc5415bdde816b0cca8d18" Mar 21 04:06:24 crc kubenswrapper[4685]: I0321 04:06:24.391132 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn" Mar 21 04:06:24 crc kubenswrapper[4685]: I0321 04:06:24.394126 4685 generic.go:334] "Generic (PLEG): container finished" podID="eafc03fc-f5ca-4c3a-bd90-d9cac3ab3edb" containerID="702916a75fb456f3b64f4dcfb2eb966ae9ae6617d4c1e515a270fff3fd4b6224" exitCode=0 Mar 21 04:06:24 crc kubenswrapper[4685]: I0321 04:06:24.394198 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbicand8e1-account-delete-vjx8v" event={"ID":"eafc03fc-f5ca-4c3a-bd90-d9cac3ab3edb","Type":"ContainerDied","Data":"702916a75fb456f3b64f4dcfb2eb966ae9ae6617d4c1e515a270fff3fd4b6224"} Mar 21 04:06:24 crc kubenswrapper[4685]: I0321 04:06:24.394249 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbicand8e1-account-delete-vjx8v" event={"ID":"eafc03fc-f5ca-4c3a-bd90-d9cac3ab3edb","Type":"ContainerStarted","Data":"578d6f98c9a053a2681a87bc60be3966fd31c3b3eaa3505030fa9d00aeb404a2"} Mar 21 04:06:24 crc kubenswrapper[4685]: I0321 04:06:24.417115 4685 scope.go:117] "RemoveContainer" containerID="8bfecfb645f4196e9dccd3ee2296d5172a1a4ab24c737ee806dc3c54ba1dac61" Mar 21 04:06:24 crc kubenswrapper[4685]: I0321 04:06:24.455359 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn"] Mar 21 04:06:24 crc kubenswrapper[4685]: I0321 04:06:24.467181 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-api-7bddf7b4cd-8bsrn"] Mar 21 04:06:25 crc kubenswrapper[4685]: I0321 04:06:25.715010 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbicand8e1-account-delete-vjx8v" Mar 21 04:06:25 crc kubenswrapper[4685]: I0321 04:06:25.773597 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xvh7\" (UniqueName: \"kubernetes.io/projected/eafc03fc-f5ca-4c3a-bd90-d9cac3ab3edb-kube-api-access-9xvh7\") pod \"eafc03fc-f5ca-4c3a-bd90-d9cac3ab3edb\" (UID: \"eafc03fc-f5ca-4c3a-bd90-d9cac3ab3edb\") " Mar 21 04:06:25 crc kubenswrapper[4685]: I0321 04:06:25.773678 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eafc03fc-f5ca-4c3a-bd90-d9cac3ab3edb-operator-scripts\") pod \"eafc03fc-f5ca-4c3a-bd90-d9cac3ab3edb\" (UID: \"eafc03fc-f5ca-4c3a-bd90-d9cac3ab3edb\") " Mar 21 04:06:25 crc kubenswrapper[4685]: I0321 04:06:25.774461 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eafc03fc-f5ca-4c3a-bd90-d9cac3ab3edb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eafc03fc-f5ca-4c3a-bd90-d9cac3ab3edb" (UID: "eafc03fc-f5ca-4c3a-bd90-d9cac3ab3edb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:06:25 crc kubenswrapper[4685]: I0321 04:06:25.791046 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eafc03fc-f5ca-4c3a-bd90-d9cac3ab3edb-kube-api-access-9xvh7" (OuterVolumeSpecName: "kube-api-access-9xvh7") pod "eafc03fc-f5ca-4c3a-bd90-d9cac3ab3edb" (UID: "eafc03fc-f5ca-4c3a-bd90-d9cac3ab3edb"). InnerVolumeSpecName "kube-api-access-9xvh7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:06:25 crc kubenswrapper[4685]: I0321 04:06:25.875416 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xvh7\" (UniqueName: \"kubernetes.io/projected/eafc03fc-f5ca-4c3a-bd90-d9cac3ab3edb-kube-api-access-9xvh7\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:25 crc kubenswrapper[4685]: I0321 04:06:25.875454 4685 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eafc03fc-f5ca-4c3a-bd90-d9cac3ab3edb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:26 crc kubenswrapper[4685]: I0321 04:06:26.307922 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53773579-45fd-485e-b876-2f1217ebe807" path="/var/lib/kubelet/pods/53773579-45fd-485e-b876-2f1217ebe807/volumes" Mar 21 04:06:26 crc kubenswrapper[4685]: I0321 04:06:26.408911 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbicand8e1-account-delete-vjx8v" event={"ID":"eafc03fc-f5ca-4c3a-bd90-d9cac3ab3edb","Type":"ContainerDied","Data":"578d6f98c9a053a2681a87bc60be3966fd31c3b3eaa3505030fa9d00aeb404a2"} Mar 21 04:06:26 crc kubenswrapper[4685]: I0321 04:06:26.408955 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="578d6f98c9a053a2681a87bc60be3966fd31c3b3eaa3505030fa9d00aeb404a2" Mar 21 04:06:26 crc kubenswrapper[4685]: I0321 04:06:26.408957 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbicand8e1-account-delete-vjx8v" Mar 21 04:06:28 crc kubenswrapper[4685]: I0321 04:06:28.652565 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-z4q5w"] Mar 21 04:06:28 crc kubenswrapper[4685]: I0321 04:06:28.660464 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbicand8e1-account-delete-vjx8v"] Mar 21 04:06:28 crc kubenswrapper[4685]: I0321 04:06:28.666964 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-d8e1-account-create-update-hrpbv"] Mar 21 04:06:28 crc kubenswrapper[4685]: I0321 04:06:28.675945 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbicand8e1-account-delete-vjx8v"] Mar 21 04:06:28 crc kubenswrapper[4685]: I0321 04:06:28.685760 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-z4q5w"] Mar 21 04:06:28 crc kubenswrapper[4685]: I0321 04:06:28.692776 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-d8e1-account-create-update-hrpbv"] Mar 21 04:06:30 crc kubenswrapper[4685]: I0321 04:06:30.308603 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4048b1a3-3a1f-40c7-be26-3f917fb8b0bd" path="/var/lib/kubelet/pods/4048b1a3-3a1f-40c7-be26-3f917fb8b0bd/volumes" Mar 21 04:06:30 crc kubenswrapper[4685]: I0321 04:06:30.309597 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68904f9b-5534-4133-92d8-37d36c09d27d" path="/var/lib/kubelet/pods/68904f9b-5534-4133-92d8-37d36c09d27d/volumes" Mar 21 04:06:30 crc kubenswrapper[4685]: I0321 04:06:30.310069 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eafc03fc-f5ca-4c3a-bd90-d9cac3ab3edb" path="/var/lib/kubelet/pods/eafc03fc-f5ca-4c3a-bd90-d9cac3ab3edb/volumes" Mar 21 04:06:31 crc kubenswrapper[4685]: W0321 04:06:31.133878 4685 watcher.go:93] Error while processing event 
("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53773579_45fd_485e_b876_2f1217ebe807.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53773579_45fd_485e_b876_2f1217ebe807.slice: no such file or directory Mar 21 04:06:31 crc kubenswrapper[4685]: W0321 04:06:31.133933 4685 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeafc03fc_f5ca_4c3a_bd90_d9cac3ab3edb.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeafc03fc_f5ca_4c3a_bd90_d9cac3ab3edb.slice: no such file or directory Mar 21 04:06:31 crc kubenswrapper[4685]: I0321 04:06:31.442286 4685 generic.go:334] "Generic (PLEG): container finished" podID="cf854e07-f0f9-4160-9dbc-ccda00f50a21" containerID="bd97aea7a640f3a4b7ab19871ef72a957097598c0d7085456eef047123523924" exitCode=137 Mar 21 04:06:31 crc kubenswrapper[4685]: I0321 04:06:31.442358 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-8ppvx" event={"ID":"cf854e07-f0f9-4160-9dbc-ccda00f50a21","Type":"ContainerDied","Data":"bd97aea7a640f3a4b7ab19871ef72a957097598c0d7085456eef047123523924"} Mar 21 04:06:31 crc kubenswrapper[4685]: I0321 04:06:31.442391 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-8ppvx" event={"ID":"cf854e07-f0f9-4160-9dbc-ccda00f50a21","Type":"ContainerDied","Data":"7c98da6f2f706689474ba49a9508852a237f7aacfb5691d14df4db3b1746384a"} Mar 21 04:06:31 crc kubenswrapper[4685]: I0321 04:06:31.442406 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c98da6f2f706689474ba49a9508852a237f7aacfb5691d14df4db3b1746384a" Mar 21 04:06:31 crc kubenswrapper[4685]: I0321 04:06:31.443853 4685 generic.go:334] "Generic (PLEG): container finished" podID="c05bf573-3e4a-4bee-8638-61b2c36dce22" containerID="13ec1b041ce06de426f7f70f380db4dd4f8178435b8c10695fc72aeff679a45f" exitCode=137 Mar 21 04:06:31 crc kubenswrapper[4685]: I0321 04:06:31.443877 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-r4p2z" event={"ID":"c05bf573-3e4a-4bee-8638-61b2c36dce22","Type":"ContainerDied","Data":"13ec1b041ce06de426f7f70f380db4dd4f8178435b8c10695fc72aeff679a45f"} Mar 21 04:06:31 crc kubenswrapper[4685]: I0321 04:06:31.530747 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-8ppvx" Mar 21 04:06:31 crc kubenswrapper[4685]: I0321 04:06:31.534488 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-r4p2z" Mar 21 04:06:31 crc kubenswrapper[4685]: I0321 04:06:31.656713 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcpbg\" (UniqueName: \"kubernetes.io/projected/cf854e07-f0f9-4160-9dbc-ccda00f50a21-kube-api-access-jcpbg\") pod \"cf854e07-f0f9-4160-9dbc-ccda00f50a21\" (UID: \"cf854e07-f0f9-4160-9dbc-ccda00f50a21\") " Mar 21 04:06:31 crc kubenswrapper[4685]: I0321 04:06:31.656778 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c05bf573-3e4a-4bee-8638-61b2c36dce22-config-data-custom\") pod \"c05bf573-3e4a-4bee-8638-61b2c36dce22\" (UID: \"c05bf573-3e4a-4bee-8638-61b2c36dce22\") " Mar 21 04:06:31 crc kubenswrapper[4685]: I0321 04:06:31.656815 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf854e07-f0f9-4160-9dbc-ccda00f50a21-config-data\") pod \"cf854e07-f0f9-4160-9dbc-ccda00f50a21\" (UID: \"cf854e07-f0f9-4160-9dbc-ccda00f50a21\") " Mar 21 04:06:31 crc kubenswrapper[4685]: I0321 04:06:31.656829 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c05bf573-3e4a-4bee-8638-61b2c36dce22-config-data\") pod \"c05bf573-3e4a-4bee-8638-61b2c36dce22\" (UID: \"c05bf573-3e4a-4bee-8638-61b2c36dce22\") " Mar 21 04:06:31 crc kubenswrapper[4685]: I0321 04:06:31.656878 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf854e07-f0f9-4160-9dbc-ccda00f50a21-config-data-custom\") pod \"cf854e07-f0f9-4160-9dbc-ccda00f50a21\" (UID: \"cf854e07-f0f9-4160-9dbc-ccda00f50a21\") " Mar 21 04:06:31 crc kubenswrapper[4685]: I0321 04:06:31.656946 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k465r\" (UniqueName: \"kubernetes.io/projected/c05bf573-3e4a-4bee-8638-61b2c36dce22-kube-api-access-k465r\") pod \"c05bf573-3e4a-4bee-8638-61b2c36dce22\" (UID: \"c05bf573-3e4a-4bee-8638-61b2c36dce22\") " Mar 21 04:06:31 crc kubenswrapper[4685]: I0321 04:06:31.656990 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c05bf573-3e4a-4bee-8638-61b2c36dce22-logs\") pod \"c05bf573-3e4a-4bee-8638-61b2c36dce22\" (UID: \"c05bf573-3e4a-4bee-8638-61b2c36dce22\") " Mar 21 04:06:31 crc kubenswrapper[4685]: I0321 04:06:31.657016 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf854e07-f0f9-4160-9dbc-ccda00f50a21-logs\") pod \"cf854e07-f0f9-4160-9dbc-ccda00f50a21\" (UID: \"cf854e07-f0f9-4160-9dbc-ccda00f50a21\") " Mar 21 04:06:31 crc kubenswrapper[4685]: I0321 04:06:31.657510 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf854e07-f0f9-4160-9dbc-ccda00f50a21-logs" (OuterVolumeSpecName: "logs") pod "cf854e07-f0f9-4160-9dbc-ccda00f50a21" (UID: "cf854e07-f0f9-4160-9dbc-ccda00f50a21"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:06:31 crc kubenswrapper[4685]: I0321 04:06:31.657553 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c05bf573-3e4a-4bee-8638-61b2c36dce22-logs" (OuterVolumeSpecName: "logs") pod "c05bf573-3e4a-4bee-8638-61b2c36dce22" (UID: "c05bf573-3e4a-4bee-8638-61b2c36dce22"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:06:31 crc kubenswrapper[4685]: I0321 04:06:31.661987 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c05bf573-3e4a-4bee-8638-61b2c36dce22-kube-api-access-k465r" (OuterVolumeSpecName: "kube-api-access-k465r") pod "c05bf573-3e4a-4bee-8638-61b2c36dce22" (UID: "c05bf573-3e4a-4bee-8638-61b2c36dce22"). InnerVolumeSpecName "kube-api-access-k465r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:06:31 crc kubenswrapper[4685]: I0321 04:06:31.662300 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c05bf573-3e4a-4bee-8638-61b2c36dce22-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c05bf573-3e4a-4bee-8638-61b2c36dce22" (UID: "c05bf573-3e4a-4bee-8638-61b2c36dce22"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:06:31 crc kubenswrapper[4685]: I0321 04:06:31.662299 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf854e07-f0f9-4160-9dbc-ccda00f50a21-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cf854e07-f0f9-4160-9dbc-ccda00f50a21" (UID: "cf854e07-f0f9-4160-9dbc-ccda00f50a21"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:06:31 crc kubenswrapper[4685]: I0321 04:06:31.662349 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf854e07-f0f9-4160-9dbc-ccda00f50a21-kube-api-access-jcpbg" (OuterVolumeSpecName: "kube-api-access-jcpbg") pod "cf854e07-f0f9-4160-9dbc-ccda00f50a21" (UID: "cf854e07-f0f9-4160-9dbc-ccda00f50a21"). InnerVolumeSpecName "kube-api-access-jcpbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:06:31 crc kubenswrapper[4685]: I0321 04:06:31.687968 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c05bf573-3e4a-4bee-8638-61b2c36dce22-config-data" (OuterVolumeSpecName: "config-data") pod "c05bf573-3e4a-4bee-8638-61b2c36dce22" (UID: "c05bf573-3e4a-4bee-8638-61b2c36dce22"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:06:31 crc kubenswrapper[4685]: I0321 04:06:31.704806 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf854e07-f0f9-4160-9dbc-ccda00f50a21-config-data" (OuterVolumeSpecName: "config-data") pod "cf854e07-f0f9-4160-9dbc-ccda00f50a21" (UID: "cf854e07-f0f9-4160-9dbc-ccda00f50a21"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:06:31 crc kubenswrapper[4685]: I0321 04:06:31.759227 4685 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c05bf573-3e4a-4bee-8638-61b2c36dce22-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:31 crc kubenswrapper[4685]: I0321 04:06:31.759285 4685 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf854e07-f0f9-4160-9dbc-ccda00f50a21-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:31 crc kubenswrapper[4685]: I0321 04:06:31.759316 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k465r\" (UniqueName: \"kubernetes.io/projected/c05bf573-3e4a-4bee-8638-61b2c36dce22-kube-api-access-k465r\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:31 crc kubenswrapper[4685]: I0321 04:06:31.759341 4685 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c05bf573-3e4a-4bee-8638-61b2c36dce22-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:31 crc kubenswrapper[4685]: I0321 04:06:31.759365 4685 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf854e07-f0f9-4160-9dbc-ccda00f50a21-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:31 crc kubenswrapper[4685]: I0321 04:06:31.759384 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcpbg\" (UniqueName: \"kubernetes.io/projected/cf854e07-f0f9-4160-9dbc-ccda00f50a21-kube-api-access-jcpbg\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:31 crc kubenswrapper[4685]: I0321 04:06:31.759405 4685 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c05bf573-3e4a-4bee-8638-61b2c36dce22-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:31 crc kubenswrapper[4685]: I0321 04:06:31.759422 4685 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf854e07-f0f9-4160-9dbc-ccda00f50a21-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:32 crc kubenswrapper[4685]: I0321 04:06:32.458524 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-r4p2z" event={"ID":"c05bf573-3e4a-4bee-8638-61b2c36dce22","Type":"ContainerDied","Data":"16957e5ebcc5faef4c58614487c49c85ac87689f56b8e12f7fe7fd76bc4933ad"} Mar 21 04:06:32 crc kubenswrapper[4685]: I0321 04:06:32.458562 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-8ppvx" Mar 21 04:06:32 crc kubenswrapper[4685]: I0321 04:06:32.459891 4685 scope.go:117] "RemoveContainer" containerID="13ec1b041ce06de426f7f70f380db4dd4f8178435b8c10695fc72aeff679a45f" Mar 21 04:06:32 crc kubenswrapper[4685]: I0321 04:06:32.458577 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-5d6fc64879-r4p2z" Mar 21 04:06:32 crc kubenswrapper[4685]: I0321 04:06:32.493628 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-worker-5d6fc64879-r4p2z"] Mar 21 04:06:32 crc kubenswrapper[4685]: I0321 04:06:32.500390 4685 scope.go:117] "RemoveContainer" containerID="4b2374ad56331236fea61b91e79b5f59d5b0f3e71d4d59513faa4d5d0dcd14a5" Mar 21 04:06:32 crc kubenswrapper[4685]: I0321 04:06:32.505105 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-worker-5d6fc64879-r4p2z"] Mar 21 04:06:32 crc kubenswrapper[4685]: I0321 04:06:32.514201 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-8ppvx"] Mar 21 04:06:32 crc kubenswrapper[4685]: I0321 04:06:32.520020 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-566574dc7b-8ppvx"] Mar 21 04:06:34 crc kubenswrapper[4685]: I0321 04:06:34.308344 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c05bf573-3e4a-4bee-8638-61b2c36dce22" path="/var/lib/kubelet/pods/c05bf573-3e4a-4bee-8638-61b2c36dce22/volumes" Mar 21 04:06:34 crc kubenswrapper[4685]: I0321 04:06:34.309363 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf854e07-f0f9-4160-9dbc-ccda00f50a21" path="/var/lib/kubelet/pods/cf854e07-f0f9-4160-9dbc-ccda00f50a21/volumes" Mar 21 04:06:38 crc kubenswrapper[4685]: I0321 04:06:38.329264 4685 scope.go:117] "RemoveContainer" containerID="df54b908af7408c6db30b39d212ea30f6e7d3f734d3dbd8e347006a39eef7666" Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.648945 4685 generic.go:334] "Generic (PLEG): container finished" podID="6ec89a16-8152-4ea8-82b3-2ace0ec166e0" containerID="6d8176121d09438ebad2adae26248f7dbba93317ba8e9b014e3d67e90f02a03c" exitCode=137 Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.649472 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-5cb86f4b6b-mmswh" event={"ID":"6ec89a16-8152-4ea8-82b3-2ace0ec166e0","Type":"ContainerDied","Data":"6d8176121d09438ebad2adae26248f7dbba93317ba8e9b014e3d67e90f02a03c"} Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.651104 4685 generic.go:334] "Generic (PLEG): container finished" podID="7abf263e-1caf-4c51-b312-0246e3f5d374" containerID="7915a83dda864efb67074f26a3bdbf79eaf343e632a7b814a46c79b5d8a1f16b" exitCode=137 Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.651128 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-76848f96c5-bkdvl" event={"ID":"7abf263e-1caf-4c51-b312-0246e3f5d374","Type":"ContainerDied","Data":"7915a83dda864efb67074f26a3bdbf79eaf343e632a7b814a46c79b5d8a1f16b"} Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.736398 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-5cb86f4b6b-mmswh" Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.741664 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-76848f96c5-bkdvl" Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.772556 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ec89a16-8152-4ea8-82b3-2ace0ec166e0-config-data\") pod \"6ec89a16-8152-4ea8-82b3-2ace0ec166e0\" (UID: \"6ec89a16-8152-4ea8-82b3-2ace0ec166e0\") " Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.772675 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7abf263e-1caf-4c51-b312-0246e3f5d374-combined-ca-bundle\") pod \"7abf263e-1caf-4c51-b312-0246e3f5d374\" (UID: \"7abf263e-1caf-4c51-b312-0246e3f5d374\") " Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.772744 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h24cx\" (UniqueName: \"kubernetes.io/projected/7abf263e-1caf-4c51-b312-0246e3f5d374-kube-api-access-h24cx\") pod \"7abf263e-1caf-4c51-b312-0246e3f5d374\" (UID: \"7abf263e-1caf-4c51-b312-0246e3f5d374\") " Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.772772 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7abf263e-1caf-4c51-b312-0246e3f5d374-config-data-custom\") pod \"7abf263e-1caf-4c51-b312-0246e3f5d374\" (UID: \"7abf263e-1caf-4c51-b312-0246e3f5d374\") " Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.772910 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvbkp\" (UniqueName: \"kubernetes.io/projected/6ec89a16-8152-4ea8-82b3-2ace0ec166e0-kube-api-access-dvbkp\") pod \"6ec89a16-8152-4ea8-82b3-2ace0ec166e0\" (UID: \"6ec89a16-8152-4ea8-82b3-2ace0ec166e0\") " Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.772982 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7abf263e-1caf-4c51-b312-0246e3f5d374-logs\") pod \"7abf263e-1caf-4c51-b312-0246e3f5d374\" (UID: \"7abf263e-1caf-4c51-b312-0246e3f5d374\") " Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.773034 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ec89a16-8152-4ea8-82b3-2ace0ec166e0-combined-ca-bundle\") pod \"6ec89a16-8152-4ea8-82b3-2ace0ec166e0\" (UID: \"6ec89a16-8152-4ea8-82b3-2ace0ec166e0\") " Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.773108 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7abf263e-1caf-4c51-b312-0246e3f5d374-config-data\") pod \"7abf263e-1caf-4c51-b312-0246e3f5d374\" (UID: \"7abf263e-1caf-4c51-b312-0246e3f5d374\") " Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.773144 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ec89a16-8152-4ea8-82b3-2ace0ec166e0-config-data-custom\") pod \"6ec89a16-8152-4ea8-82b3-2ace0ec166e0\" (UID: \"6ec89a16-8152-4ea8-82b3-2ace0ec166e0\") " Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.773166 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ec89a16-8152-4ea8-82b3-2ace0ec166e0-logs\") pod 
\"6ec89a16-8152-4ea8-82b3-2ace0ec166e0\" (UID: \"6ec89a16-8152-4ea8-82b3-2ace0ec166e0\") " Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.774334 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ec89a16-8152-4ea8-82b3-2ace0ec166e0-logs" (OuterVolumeSpecName: "logs") pod "6ec89a16-8152-4ea8-82b3-2ace0ec166e0" (UID: "6ec89a16-8152-4ea8-82b3-2ace0ec166e0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.776701 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7abf263e-1caf-4c51-b312-0246e3f5d374-logs" (OuterVolumeSpecName: "logs") pod "7abf263e-1caf-4c51-b312-0246e3f5d374" (UID: "7abf263e-1caf-4c51-b312-0246e3f5d374"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.781300 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7abf263e-1caf-4c51-b312-0246e3f5d374-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7abf263e-1caf-4c51-b312-0246e3f5d374" (UID: "7abf263e-1caf-4c51-b312-0246e3f5d374"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.781465 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7abf263e-1caf-4c51-b312-0246e3f5d374-kube-api-access-h24cx" (OuterVolumeSpecName: "kube-api-access-h24cx") pod "7abf263e-1caf-4c51-b312-0246e3f5d374" (UID: "7abf263e-1caf-4c51-b312-0246e3f5d374"). InnerVolumeSpecName "kube-api-access-h24cx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.798760 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ec89a16-8152-4ea8-82b3-2ace0ec166e0-kube-api-access-dvbkp" (OuterVolumeSpecName: "kube-api-access-dvbkp") pod "6ec89a16-8152-4ea8-82b3-2ace0ec166e0" (UID: "6ec89a16-8152-4ea8-82b3-2ace0ec166e0"). InnerVolumeSpecName "kube-api-access-dvbkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.800113 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ec89a16-8152-4ea8-82b3-2ace0ec166e0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6ec89a16-8152-4ea8-82b3-2ace0ec166e0" (UID: "6ec89a16-8152-4ea8-82b3-2ace0ec166e0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.805375 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ec89a16-8152-4ea8-82b3-2ace0ec166e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ec89a16-8152-4ea8-82b3-2ace0ec166e0" (UID: "6ec89a16-8152-4ea8-82b3-2ace0ec166e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.815688 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7abf263e-1caf-4c51-b312-0246e3f5d374-config-data" (OuterVolumeSpecName: "config-data") pod "7abf263e-1caf-4c51-b312-0246e3f5d374" (UID: "7abf263e-1caf-4c51-b312-0246e3f5d374"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.816939 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7abf263e-1caf-4c51-b312-0246e3f5d374-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7abf263e-1caf-4c51-b312-0246e3f5d374" (UID: "7abf263e-1caf-4c51-b312-0246e3f5d374"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.823817 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ec89a16-8152-4ea8-82b3-2ace0ec166e0-config-data" (OuterVolumeSpecName: "config-data") pod "6ec89a16-8152-4ea8-82b3-2ace0ec166e0" (UID: "6ec89a16-8152-4ea8-82b3-2ace0ec166e0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.874460 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvbkp\" (UniqueName: \"kubernetes.io/projected/6ec89a16-8152-4ea8-82b3-2ace0ec166e0-kube-api-access-dvbkp\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.874494 4685 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7abf263e-1caf-4c51-b312-0246e3f5d374-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.874506 4685 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ec89a16-8152-4ea8-82b3-2ace0ec166e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.874516 4685 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7abf263e-1caf-4c51-b312-0246e3f5d374-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.874525 4685 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ec89a16-8152-4ea8-82b3-2ace0ec166e0-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.874534 4685 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ec89a16-8152-4ea8-82b3-2ace0ec166e0-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.874542 4685 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ec89a16-8152-4ea8-82b3-2ace0ec166e0-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.874550 4685 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7abf263e-1caf-4c51-b312-0246e3f5d374-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.874557 4685 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7abf263e-1caf-4c51-b312-0246e3f5d374-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:52 crc kubenswrapper[4685]: I0321 04:06:52.874565 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h24cx\" (UniqueName: 
\"kubernetes.io/projected/7abf263e-1caf-4c51-b312-0246e3f5d374-kube-api-access-h24cx\") on node \"crc\" DevicePath \"\"" Mar 21 04:06:53 crc kubenswrapper[4685]: I0321 04:06:53.661994 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-5cb86f4b6b-mmswh" event={"ID":"6ec89a16-8152-4ea8-82b3-2ace0ec166e0","Type":"ContainerDied","Data":"c6054c68a424789d66d5df2a1d42aa7855bb9fc484df06c06032065702880ad2"} Mar 21 04:06:53 crc kubenswrapper[4685]: I0321 04:06:53.662049 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-5cb86f4b6b-mmswh" Mar 21 04:06:53 crc kubenswrapper[4685]: I0321 04:06:53.662076 4685 scope.go:117] "RemoveContainer" containerID="6d8176121d09438ebad2adae26248f7dbba93317ba8e9b014e3d67e90f02a03c" Mar 21 04:06:53 crc kubenswrapper[4685]: I0321 04:06:53.665784 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-76848f96c5-bkdvl" event={"ID":"7abf263e-1caf-4c51-b312-0246e3f5d374","Type":"ContainerDied","Data":"d835dad5e3d9573788f877fe4aa9d870df6381ed3abbe874b5e13f02e253c397"} Mar 21 04:06:53 crc kubenswrapper[4685]: I0321 04:06:53.665899 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-76848f96c5-bkdvl" Mar 21 04:06:53 crc kubenswrapper[4685]: I0321 04:06:53.682369 4685 scope.go:117] "RemoveContainer" containerID="f2cd686ac86d49332eea5741ac5afc036152e4e2730b0cffbfd2d6bebbbd78e0" Mar 21 04:06:53 crc kubenswrapper[4685]: I0321 04:06:53.702326 4685 scope.go:117] "RemoveContainer" containerID="7915a83dda864efb67074f26a3bdbf79eaf343e632a7b814a46c79b5d8a1f16b" Mar 21 04:06:53 crc kubenswrapper[4685]: I0321 04:06:53.715955 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-5cb86f4b6b-mmswh"] Mar 21 04:06:53 crc kubenswrapper[4685]: I0321 04:06:53.718063 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-5cb86f4b6b-mmswh"] Mar 21 04:06:53 crc kubenswrapper[4685]: I0321 04:06:53.725672 4685 scope.go:117] "RemoveContainer" containerID="352fb245e5d35cecab1a4ff376f6c7b12cf24ae5465c9aeb28fc0193d7b53746" Mar 21 04:06:53 crc kubenswrapper[4685]: I0321 04:06:53.726369 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-worker-76848f96c5-bkdvl"] Mar 21 04:06:53 crc kubenswrapper[4685]: I0321 04:06:53.731691 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-worker-76848f96c5-bkdvl"] Mar 21 04:06:54 crc kubenswrapper[4685]: I0321 04:06:54.310602 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ec89a16-8152-4ea8-82b3-2ace0ec166e0" path="/var/lib/kubelet/pods/6ec89a16-8152-4ea8-82b3-2ace0ec166e0/volumes" Mar 21 04:06:54 crc kubenswrapper[4685]: I0321 04:06:54.311658 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7abf263e-1caf-4c51-b312-0246e3f5d374" path="/var/lib/kubelet/pods/7abf263e-1caf-4c51-b312-0246e3f5d374/volumes" Mar 21 04:07:00 crc kubenswrapper[4685]: E0321 04:07:00.289094 4685 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.158:43520->38.102.83.158:42143: write tcp 38.102.83.158:43520->38.102.83.158:42143: write: broken pipe Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.523546 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["barbican-kuttl-tests/keystone-db-sync-cdf4l"] Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.529184 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystone-bootstrap-dqllc"] Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.535688 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/keystone-db-sync-cdf4l"] Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.550982 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/keystone-bootstrap-dqllc"] Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.556297 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystone-8649f5b8f-mdp78"] Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.556735 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/keystone-8649f5b8f-mdp78" podUID="0b2b4768-f546-4fad-9609-8b01fa7749dc" containerName="keystone-api" containerID="cri-o://22d93438459a6e12b158d3b08c3f4ece277363dc0de50210f3185acb8116f22d" gracePeriod=30 Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.575901 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/keystonee186-account-delete-ppwv7"] Mar 21 04:07:01 crc kubenswrapper[4685]: E0321 04:07:01.576396 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ec89a16-8152-4ea8-82b3-2ace0ec166e0" containerName="barbican-keystone-listener-log" Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.576470 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec89a16-8152-4ea8-82b3-2ace0ec166e0" containerName="barbican-keystone-listener-log" Mar 21 04:07:01 crc kubenswrapper[4685]: E0321 04:07:01.576524 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c05bf573-3e4a-4bee-8638-61b2c36dce22" containerName="barbican-worker" Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.576571 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="c05bf573-3e4a-4bee-8638-61b2c36dce22" containerName="barbican-worker" Mar 21 04:07:01 crc kubenswrapper[4685]: E0321 04:07:01.576626 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53773579-45fd-485e-b876-2f1217ebe807" containerName="barbican-api-log" Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.576674 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="53773579-45fd-485e-b876-2f1217ebe807" containerName="barbican-api-log" Mar 21 04:07:01 crc kubenswrapper[4685]: E0321 04:07:01.576727 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ec89a16-8152-4ea8-82b3-2ace0ec166e0" containerName="barbican-keystone-listener" Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.576779 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec89a16-8152-4ea8-82b3-2ace0ec166e0" containerName="barbican-keystone-listener" Mar 21 04:07:01 crc kubenswrapper[4685]: E0321 04:07:01.576847 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53773579-45fd-485e-b876-2f1217ebe807" containerName="barbican-api" Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.576905 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="53773579-45fd-485e-b876-2f1217ebe807" containerName="barbican-api" Mar 21 04:07:01 crc kubenswrapper[4685]: E0321 04:07:01.576958 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eafc03fc-f5ca-4c3a-bd90-d9cac3ab3edb" containerName="mariadb-account-delete" Mar 21 04:07:01 crc 
kubenswrapper[4685]: I0321 04:07:01.577013 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="eafc03fc-f5ca-4c3a-bd90-d9cac3ab3edb" containerName="mariadb-account-delete" Mar 21 04:07:01 crc kubenswrapper[4685]: E0321 04:07:01.577068 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf854e07-f0f9-4160-9dbc-ccda00f50a21" containerName="barbican-keystone-listener-log" Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.577120 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf854e07-f0f9-4160-9dbc-ccda00f50a21" containerName="barbican-keystone-listener-log" Mar 21 04:07:01 crc kubenswrapper[4685]: E0321 04:07:01.577172 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c05bf573-3e4a-4bee-8638-61b2c36dce22" containerName="barbican-worker-log" Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.577216 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="c05bf573-3e4a-4bee-8638-61b2c36dce22" containerName="barbican-worker-log" Mar 21 04:07:01 crc kubenswrapper[4685]: E0321 04:07:01.577268 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7abf263e-1caf-4c51-b312-0246e3f5d374" containerName="barbican-worker" Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.577317 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="7abf263e-1caf-4c51-b312-0246e3f5d374" containerName="barbican-worker" Mar 21 04:07:01 crc kubenswrapper[4685]: E0321 04:07:01.577371 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf854e07-f0f9-4160-9dbc-ccda00f50a21" containerName="barbican-keystone-listener" Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.577423 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf854e07-f0f9-4160-9dbc-ccda00f50a21" containerName="barbican-keystone-listener" Mar 21 04:07:01 crc kubenswrapper[4685]: E0321 04:07:01.577486 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7abf263e-1caf-4c51-b312-0246e3f5d374" containerName="barbican-worker-log" Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.577548 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="7abf263e-1caf-4c51-b312-0246e3f5d374" containerName="barbican-worker-log" Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.577699 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf854e07-f0f9-4160-9dbc-ccda00f50a21" containerName="barbican-keystone-listener" Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.577770 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="7abf263e-1caf-4c51-b312-0246e3f5d374" containerName="barbican-worker-log" Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.577822 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="c05bf573-3e4a-4bee-8638-61b2c36dce22" containerName="barbican-worker-log" Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.577972 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="53773579-45fd-485e-b876-2f1217ebe807" containerName="barbican-api" Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.578028 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ec89a16-8152-4ea8-82b3-2ace0ec166e0" containerName="barbican-keystone-listener" Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.578078 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf854e07-f0f9-4160-9dbc-ccda00f50a21" containerName="barbican-keystone-listener-log" Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.578126 
4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="53773579-45fd-485e-b876-2f1217ebe807" containerName="barbican-api-log" Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.578183 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ec89a16-8152-4ea8-82b3-2ace0ec166e0" containerName="barbican-keystone-listener-log" Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.578236 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="7abf263e-1caf-4c51-b312-0246e3f5d374" containerName="barbican-worker" Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.578325 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="eafc03fc-f5ca-4c3a-bd90-d9cac3ab3edb" containerName="mariadb-account-delete" Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.578381 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="c05bf573-3e4a-4bee-8638-61b2c36dce22" containerName="barbican-worker" Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.578891 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystonee186-account-delete-ppwv7" Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.588804 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystonee186-account-delete-ppwv7"] Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.701447 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f046207-c975-4417-89f1-650002978bca-operator-scripts\") pod \"keystonee186-account-delete-ppwv7\" (UID: \"1f046207-c975-4417-89f1-650002978bca\") " pod="barbican-kuttl-tests/keystonee186-account-delete-ppwv7" Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.701685 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5klpq\" (UniqueName: \"kubernetes.io/projected/1f046207-c975-4417-89f1-650002978bca-kube-api-access-5klpq\") pod \"keystonee186-account-delete-ppwv7\" (UID: \"1f046207-c975-4417-89f1-650002978bca\") " pod="barbican-kuttl-tests/keystonee186-account-delete-ppwv7" Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.803220 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f046207-c975-4417-89f1-650002978bca-operator-scripts\") pod \"keystonee186-account-delete-ppwv7\" (UID: \"1f046207-c975-4417-89f1-650002978bca\") " pod="barbican-kuttl-tests/keystonee186-account-delete-ppwv7" Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.803307 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5klpq\" (UniqueName: \"kubernetes.io/projected/1f046207-c975-4417-89f1-650002978bca-kube-api-access-5klpq\") pod \"keystonee186-account-delete-ppwv7\" (UID: \"1f046207-c975-4417-89f1-650002978bca\") " pod="barbican-kuttl-tests/keystonee186-account-delete-ppwv7" Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.804123 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f046207-c975-4417-89f1-650002978bca-operator-scripts\") pod \"keystonee186-account-delete-ppwv7\" (UID: \"1f046207-c975-4417-89f1-650002978bca\") " pod="barbican-kuttl-tests/keystonee186-account-delete-ppwv7" Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.821365 4685 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5klpq\" (UniqueName: \"kubernetes.io/projected/1f046207-c975-4417-89f1-650002978bca-kube-api-access-5klpq\") pod \"keystonee186-account-delete-ppwv7\" (UID: \"1f046207-c975-4417-89f1-650002978bca\") " pod="barbican-kuttl-tests/keystonee186-account-delete-ppwv7" Mar 21 04:07:01 crc kubenswrapper[4685]: I0321 04:07:01.899324 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystonee186-account-delete-ppwv7" Mar 21 04:07:02 crc kubenswrapper[4685]: I0321 04:07:02.127337 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/root-account-create-update-4cflz"] Mar 21 04:07:02 crc kubenswrapper[4685]: I0321 04:07:02.134041 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/root-account-create-update-4cflz"] Mar 21 04:07:02 crc kubenswrapper[4685]: I0321 04:07:02.143634 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/root-account-create-update-kfd4c"] Mar 21 04:07:02 crc kubenswrapper[4685]: I0321 04:07:02.144709 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/root-account-create-update-kfd4c" Mar 21 04:07:02 crc kubenswrapper[4685]: I0321 04:07:02.147136 4685 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"openstack-mariadb-root-db-secret" Mar 21 04:07:02 crc kubenswrapper[4685]: I0321 04:07:02.157300 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/openstack-galera-2"] Mar 21 04:07:02 crc kubenswrapper[4685]: I0321 04:07:02.163226 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/openstack-galera-0"] Mar 21 04:07:02 crc kubenswrapper[4685]: I0321 04:07:02.170919 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/openstack-galera-1"] Mar 21 04:07:02 crc kubenswrapper[4685]: I0321 04:07:02.178534 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/root-account-create-update-kfd4c"] Mar 21 04:07:02 crc kubenswrapper[4685]: I0321 04:07:02.203770 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/root-account-create-update-kfd4c"] Mar 21 04:07:02 crc kubenswrapper[4685]: E0321 04:07:02.204279 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-5ch62 operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="barbican-kuttl-tests/root-account-create-update-kfd4c" podUID="f79a543a-5faf-40e2-9854-c5c6f19f4cec" Mar 21 04:07:02 crc kubenswrapper[4685]: I0321 04:07:02.216611 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ch62\" (UniqueName: \"kubernetes.io/projected/f79a543a-5faf-40e2-9854-c5c6f19f4cec-kube-api-access-5ch62\") pod \"root-account-create-update-kfd4c\" (UID: \"f79a543a-5faf-40e2-9854-c5c6f19f4cec\") " pod="barbican-kuttl-tests/root-account-create-update-kfd4c" Mar 21 04:07:02 crc kubenswrapper[4685]: I0321 04:07:02.216779 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f79a543a-5faf-40e2-9854-c5c6f19f4cec-operator-scripts\") pod \"root-account-create-update-kfd4c\" (UID: \"f79a543a-5faf-40e2-9854-c5c6f19f4cec\") " pod="barbican-kuttl-tests/root-account-create-update-kfd4c" 
Mar 21 04:07:02 crc kubenswrapper[4685]: I0321 04:07:02.287324 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystonee186-account-delete-ppwv7"] Mar 21 04:07:02 crc kubenswrapper[4685]: I0321 04:07:02.306092 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/openstack-galera-2" podUID="9ee610c4-8416-4d1c-a6b4-2324f1541b1c" containerName="galera" containerID="cri-o://2f406f57b1aed489f01d7e0d6eabda6a3b46be0f98918dc327c2eb44ae2b8f20" gracePeriod=30 Mar 21 04:07:02 crc kubenswrapper[4685]: I0321 04:07:02.308294 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03969ee5-9f6c-4a50-8a9f-fd651fe30da2" path="/var/lib/kubelet/pods/03969ee5-9f6c-4a50-8a9f-fd651fe30da2/volumes" Mar 21 04:07:02 crc kubenswrapper[4685]: I0321 04:07:02.309293 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38f93032-3939-4916-9b4d-addcd91de7f6" path="/var/lib/kubelet/pods/38f93032-3939-4916-9b4d-addcd91de7f6/volumes" Mar 21 04:07:02 crc kubenswrapper[4685]: I0321 04:07:02.309976 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ff6ee16-ea1a-4725-b13c-ec201554a350" path="/var/lib/kubelet/pods/8ff6ee16-ea1a-4725-b13c-ec201554a350/volumes" Mar 21 04:07:02 crc kubenswrapper[4685]: I0321 04:07:02.318332 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ch62\" (UniqueName: \"kubernetes.io/projected/f79a543a-5faf-40e2-9854-c5c6f19f4cec-kube-api-access-5ch62\") pod \"root-account-create-update-kfd4c\" (UID: \"f79a543a-5faf-40e2-9854-c5c6f19f4cec\") " pod="barbican-kuttl-tests/root-account-create-update-kfd4c" Mar 21 04:07:02 crc kubenswrapper[4685]: I0321 04:07:02.318391 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f79a543a-5faf-40e2-9854-c5c6f19f4cec-operator-scripts\") pod \"root-account-create-update-kfd4c\" (UID: \"f79a543a-5faf-40e2-9854-c5c6f19f4cec\") " pod="barbican-kuttl-tests/root-account-create-update-kfd4c" Mar 21 04:07:02 crc kubenswrapper[4685]: E0321 04:07:02.318526 4685 configmap.go:193] Couldn't get configMap barbican-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Mar 21 04:07:02 crc kubenswrapper[4685]: E0321 04:07:02.318583 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f79a543a-5faf-40e2-9854-c5c6f19f4cec-operator-scripts podName:f79a543a-5faf-40e2-9854-c5c6f19f4cec nodeName:}" failed. No retries permitted until 2026-03-21 04:07:02.818566862 +0000 UTC m=+1255.295635654 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/f79a543a-5faf-40e2-9854-c5c6f19f4cec-operator-scripts") pod "root-account-create-update-kfd4c" (UID: "f79a543a-5faf-40e2-9854-c5c6f19f4cec") : configmap "openstack-scripts" not found Mar 21 04:07:02 crc kubenswrapper[4685]: E0321 04:07:02.329335 4685 projected.go:194] Error preparing data for projected volume kube-api-access-5ch62 for pod barbican-kuttl-tests/root-account-create-update-kfd4c: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 21 04:07:02 crc kubenswrapper[4685]: E0321 04:07:02.329409 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f79a543a-5faf-40e2-9854-c5c6f19f4cec-kube-api-access-5ch62 podName:f79a543a-5faf-40e2-9854-c5c6f19f4cec nodeName:}" failed. 
No retries permitted until 2026-03-21 04:07:02.829387857 +0000 UTC m=+1255.306456649 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-5ch62" (UniqueName: "kubernetes.io/projected/f79a543a-5faf-40e2-9854-c5c6f19f4cec-kube-api-access-5ch62") pod "root-account-create-update-kfd4c" (UID: "f79a543a-5faf-40e2-9854-c5c6f19f4cec") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 21 04:07:02 crc kubenswrapper[4685]: I0321 04:07:02.708766 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/memcached-0"] Mar 21 04:07:02 crc kubenswrapper[4685]: I0321 04:07:02.709330 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/memcached-0" podUID="91653387-c2f1-4240-b710-e0c709eb769d" containerName="memcached" containerID="cri-o://78e1e20967447cec136443726f8d2e76ac857183f77e3fec410e0f394f81503a" gracePeriod=30 Mar 21 04:07:02 crc kubenswrapper[4685]: I0321 04:07:02.745392 4685 generic.go:334] "Generic (PLEG): container finished" podID="1f046207-c975-4417-89f1-650002978bca" containerID="b3c849364bb2afcdc769b52ad87ef4c71391dab80d01f127496c1c5eeeb62a40" exitCode=1 Mar 21 04:07:02 crc kubenswrapper[4685]: I0321 04:07:02.745431 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystonee186-account-delete-ppwv7" event={"ID":"1f046207-c975-4417-89f1-650002978bca","Type":"ContainerDied","Data":"b3c849364bb2afcdc769b52ad87ef4c71391dab80d01f127496c1c5eeeb62a40"} Mar 21 04:07:02 crc kubenswrapper[4685]: I0321 04:07:02.745479 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystonee186-account-delete-ppwv7" event={"ID":"1f046207-c975-4417-89f1-650002978bca","Type":"ContainerStarted","Data":"0579e74b9950502a977d96d5ac9eb3bf8539add74541b0f08d58fa5744050c41"} Mar 21 04:07:02 crc kubenswrapper[4685]: I0321 04:07:02.745553 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/root-account-create-update-kfd4c" Mar 21 04:07:02 crc kubenswrapper[4685]: I0321 04:07:02.745937 4685 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="barbican-kuttl-tests/keystonee186-account-delete-ppwv7" secret="" err="secret \"galera-openstack-dockercfg-ntnkl\" not found" Mar 21 04:07:02 crc kubenswrapper[4685]: I0321 04:07:02.745980 4685 scope.go:117] "RemoveContainer" containerID="b3c849364bb2afcdc769b52ad87ef4c71391dab80d01f127496c1c5eeeb62a40" Mar 21 04:07:02 crc kubenswrapper[4685]: I0321 04:07:02.824315 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f79a543a-5faf-40e2-9854-c5c6f19f4cec-operator-scripts\") pod \"root-account-create-update-kfd4c\" (UID: \"f79a543a-5faf-40e2-9854-c5c6f19f4cec\") " pod="barbican-kuttl-tests/root-account-create-update-kfd4c" Mar 21 04:07:02 crc kubenswrapper[4685]: E0321 04:07:02.824411 4685 configmap.go:193] Couldn't get configMap barbican-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Mar 21 04:07:02 crc kubenswrapper[4685]: E0321 04:07:02.824441 4685 configmap.go:193] Couldn't get configMap barbican-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Mar 21 04:07:02 crc kubenswrapper[4685]: E0321 04:07:02.824465 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1f046207-c975-4417-89f1-650002978bca-operator-scripts podName:1f046207-c975-4417-89f1-650002978bca nodeName:}" failed. No retries permitted until 2026-03-21 04:07:03.324450369 +0000 UTC m=+1255.801519161 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1f046207-c975-4417-89f1-650002978bca-operator-scripts") pod "keystonee186-account-delete-ppwv7" (UID: "1f046207-c975-4417-89f1-650002978bca") : configmap "openstack-scripts" not found Mar 21 04:07:02 crc kubenswrapper[4685]: E0321 04:07:02.824477 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f79a543a-5faf-40e2-9854-c5c6f19f4cec-operator-scripts podName:f79a543a-5faf-40e2-9854-c5c6f19f4cec nodeName:}" failed. No retries permitted until 2026-03-21 04:07:03.82447168 +0000 UTC m=+1256.301540472 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/f79a543a-5faf-40e2-9854-c5c6f19f4cec-operator-scripts") pod "root-account-create-update-kfd4c" (UID: "f79a543a-5faf-40e2-9854-c5c6f19f4cec") : configmap "openstack-scripts" not found Mar 21 04:07:02 crc kubenswrapper[4685]: I0321 04:07:02.911253 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/root-account-create-update-kfd4c" Mar 21 04:07:02 crc kubenswrapper[4685]: I0321 04:07:02.925583 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ch62\" (UniqueName: \"kubernetes.io/projected/f79a543a-5faf-40e2-9854-c5c6f19f4cec-kube-api-access-5ch62\") pod \"root-account-create-update-kfd4c\" (UID: \"f79a543a-5faf-40e2-9854-c5c6f19f4cec\") " pod="barbican-kuttl-tests/root-account-create-update-kfd4c" Mar 21 04:07:02 crc kubenswrapper[4685]: E0321 04:07:02.931223 4685 projected.go:194] Error preparing data for projected volume kube-api-access-5ch62 for pod barbican-kuttl-tests/root-account-create-update-kfd4c: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 21 04:07:02 crc kubenswrapper[4685]: E0321 04:07:02.931435 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f79a543a-5faf-40e2-9854-c5c6f19f4cec-kube-api-access-5ch62 podName:f79a543a-5faf-40e2-9854-c5c6f19f4cec nodeName:}" failed. No retries permitted until 2026-03-21 04:07:03.931378803 +0000 UTC m=+1256.408447585 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-5ch62" (UniqueName: "kubernetes.io/projected/f79a543a-5faf-40e2-9854-c5c6f19f4cec-kube-api-access-5ch62") pod "root-account-create-update-kfd4c" (UID: "f79a543a-5faf-40e2-9854-c5c6f19f4cec") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.049006 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-2" Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.128406 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"9ee610c4-8416-4d1c-a6b4-2324f1541b1c\" (UID: \"9ee610c4-8416-4d1c-a6b4-2324f1541b1c\") " Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.128710 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9ee610c4-8416-4d1c-a6b4-2324f1541b1c-kolla-config\") pod \"9ee610c4-8416-4d1c-a6b4-2324f1541b1c\" (UID: \"9ee610c4-8416-4d1c-a6b4-2324f1541b1c\") " Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.128740 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ee610c4-8416-4d1c-a6b4-2324f1541b1c-operator-scripts\") pod \"9ee610c4-8416-4d1c-a6b4-2324f1541b1c\" (UID: \"9ee610c4-8416-4d1c-a6b4-2324f1541b1c\") " Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.128765 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9ee610c4-8416-4d1c-a6b4-2324f1541b1c-config-data-default\") pod \"9ee610c4-8416-4d1c-a6b4-2324f1541b1c\" (UID: \"9ee610c4-8416-4d1c-a6b4-2324f1541b1c\") " Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.128799 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48qd2\" (UniqueName: \"kubernetes.io/projected/9ee610c4-8416-4d1c-a6b4-2324f1541b1c-kube-api-access-48qd2\") pod \"9ee610c4-8416-4d1c-a6b4-2324f1541b1c\" (UID: \"9ee610c4-8416-4d1c-a6b4-2324f1541b1c\") " Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.128826 4685 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9ee610c4-8416-4d1c-a6b4-2324f1541b1c-config-data-generated\") pod \"9ee610c4-8416-4d1c-a6b4-2324f1541b1c\" (UID: \"9ee610c4-8416-4d1c-a6b4-2324f1541b1c\") " Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.129496 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ee610c4-8416-4d1c-a6b4-2324f1541b1c-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "9ee610c4-8416-4d1c-a6b4-2324f1541b1c" (UID: "9ee610c4-8416-4d1c-a6b4-2324f1541b1c"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.129603 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ee610c4-8416-4d1c-a6b4-2324f1541b1c-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "9ee610c4-8416-4d1c-a6b4-2324f1541b1c" (UID: "9ee610c4-8416-4d1c-a6b4-2324f1541b1c"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.130139 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ee610c4-8416-4d1c-a6b4-2324f1541b1c-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "9ee610c4-8416-4d1c-a6b4-2324f1541b1c" (UID: "9ee610c4-8416-4d1c-a6b4-2324f1541b1c"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.131076 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ee610c4-8416-4d1c-a6b4-2324f1541b1c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9ee610c4-8416-4d1c-a6b4-2324f1541b1c" (UID: "9ee610c4-8416-4d1c-a6b4-2324f1541b1c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.134467 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ee610c4-8416-4d1c-a6b4-2324f1541b1c-kube-api-access-48qd2" (OuterVolumeSpecName: "kube-api-access-48qd2") pod "9ee610c4-8416-4d1c-a6b4-2324f1541b1c" (UID: "9ee610c4-8416-4d1c-a6b4-2324f1541b1c"). InnerVolumeSpecName "kube-api-access-48qd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.144991 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "mysql-db") pod "9ee610c4-8416-4d1c-a6b4-2324f1541b1c" (UID: "9ee610c4-8416-4d1c-a6b4-2324f1541b1c"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.166415 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/rabbitmq-server-0"] Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.230561 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48qd2\" (UniqueName: \"kubernetes.io/projected/9ee610c4-8416-4d1c-a6b4-2324f1541b1c-kube-api-access-48qd2\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.230586 4685 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9ee610c4-8416-4d1c-a6b4-2324f1541b1c-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.230603 4685 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.230612 4685 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9ee610c4-8416-4d1c-a6b4-2324f1541b1c-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.230621 4685 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ee610c4-8416-4d1c-a6b4-2324f1541b1c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.230631 4685 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9ee610c4-8416-4d1c-a6b4-2324f1541b1c-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.241929 4685 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.331974 4685 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:03 crc kubenswrapper[4685]: E0321 04:07:03.332077 4685 configmap.go:193] Couldn't get configMap barbican-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Mar 21 04:07:03 crc kubenswrapper[4685]: E0321 04:07:03.332175 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1f046207-c975-4417-89f1-650002978bca-operator-scripts podName:1f046207-c975-4417-89f1-650002978bca nodeName:}" failed. No retries permitted until 2026-03-21 04:07:04.332152096 +0000 UTC m=+1256.809220888 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1f046207-c975-4417-89f1-650002978bca-operator-scripts") pod "keystonee186-account-delete-ppwv7" (UID: "1f046207-c975-4417-89f1-650002978bca") : configmap "openstack-scripts" not found Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.516703 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/rabbitmq-server-0"] Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.755487 4685 generic.go:334] "Generic (PLEG): container finished" podID="9ee610c4-8416-4d1c-a6b4-2324f1541b1c" containerID="2f406f57b1aed489f01d7e0d6eabda6a3b46be0f98918dc327c2eb44ae2b8f20" exitCode=0 Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.755615 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-2" Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.755643 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-2" event={"ID":"9ee610c4-8416-4d1c-a6b4-2324f1541b1c","Type":"ContainerDied","Data":"2f406f57b1aed489f01d7e0d6eabda6a3b46be0f98918dc327c2eb44ae2b8f20"} Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.756147 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-2" event={"ID":"9ee610c4-8416-4d1c-a6b4-2324f1541b1c","Type":"ContainerDied","Data":"adc44a604e9766d4f8e09edba3cff5d49780e00f4296f5f7913a9ba0cf852aa9"} Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.756193 4685 scope.go:117] "RemoveContainer" containerID="2f406f57b1aed489f01d7e0d6eabda6a3b46be0f98918dc327c2eb44ae2b8f20" Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.758376 4685 generic.go:334] "Generic (PLEG): container finished" podID="91653387-c2f1-4240-b710-e0c709eb769d" containerID="78e1e20967447cec136443726f8d2e76ac857183f77e3fec410e0f394f81503a" exitCode=0 Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.758453 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/memcached-0" event={"ID":"91653387-c2f1-4240-b710-e0c709eb769d","Type":"ContainerDied","Data":"78e1e20967447cec136443726f8d2e76ac857183f77e3fec410e0f394f81503a"} Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.761900 4685 generic.go:334] "Generic (PLEG): container finished" podID="1f046207-c975-4417-89f1-650002978bca" containerID="32ba97a86558db342b9a82184d606542f267144be2a0a73ed9ba300e6288ca0c" exitCode=1 Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.762288 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystonee186-account-delete-ppwv7" event={"ID":"1f046207-c975-4417-89f1-650002978bca","Type":"ContainerDied","Data":"32ba97a86558db342b9a82184d606542f267144be2a0a73ed9ba300e6288ca0c"} Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.762676 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/root-account-create-update-kfd4c" Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.763409 4685 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="barbican-kuttl-tests/keystonee186-account-delete-ppwv7" secret="" err="secret \"galera-openstack-dockercfg-ntnkl\" not found" Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.763466 4685 scope.go:117] "RemoveContainer" containerID="32ba97a86558db342b9a82184d606542f267144be2a0a73ed9ba300e6288ca0c" Mar 21 04:07:03 crc kubenswrapper[4685]: E0321 04:07:03.763902 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-delete\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-delete pod=keystonee186-account-delete-ppwv7_barbican-kuttl-tests(1f046207-c975-4417-89f1-650002978bca)\"" pod="barbican-kuttl-tests/keystonee186-account-delete-ppwv7" podUID="1f046207-c975-4417-89f1-650002978bca" Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.797493 4685 scope.go:117] "RemoveContainer" containerID="afd181791b9ffd26344f58b9bd55f908ec8be4288ead7e37013df654d16e931b" Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.802304 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/rabbitmq-server-0" podUID="d18693b0-ea2a-4795-a4de-15a379cc8490" containerName="rabbitmq" containerID="cri-o://d744b99d0f599149431f3be36e7e73a2b55609a1d321ca7ef292c0e1275ec425" gracePeriod=604800 Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.836782 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/root-account-create-update-kfd4c"] Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.840163 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f79a543a-5faf-40e2-9854-c5c6f19f4cec-operator-scripts\") pod \"root-account-create-update-kfd4c\" (UID: \"f79a543a-5faf-40e2-9854-c5c6f19f4cec\") " pod="barbican-kuttl-tests/root-account-create-update-kfd4c" Mar 21 04:07:03 crc kubenswrapper[4685]: E0321 04:07:03.840245 4685 configmap.go:193] Couldn't get configMap barbican-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Mar 21 04:07:03 crc kubenswrapper[4685]: E0321 04:07:03.840280 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f79a543a-5faf-40e2-9854-c5c6f19f4cec-operator-scripts podName:f79a543a-5faf-40e2-9854-c5c6f19f4cec nodeName:}" failed. No retries permitted until 2026-03-21 04:07:05.840267656 +0000 UTC m=+1258.317336448 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/f79a543a-5faf-40e2-9854-c5c6f19f4cec-operator-scripts") pod "root-account-create-update-kfd4c" (UID: "f79a543a-5faf-40e2-9854-c5c6f19f4cec") : configmap "openstack-scripts" not found Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.846671 4685 scope.go:117] "RemoveContainer" containerID="2f406f57b1aed489f01d7e0d6eabda6a3b46be0f98918dc327c2eb44ae2b8f20" Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.846898 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/root-account-create-update-kfd4c"] Mar 21 04:07:03 crc kubenswrapper[4685]: E0321 04:07:03.847124 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f406f57b1aed489f01d7e0d6eabda6a3b46be0f98918dc327c2eb44ae2b8f20\": container with ID starting with 2f406f57b1aed489f01d7e0d6eabda6a3b46be0f98918dc327c2eb44ae2b8f20 not found: ID does not exist" containerID="2f406f57b1aed489f01d7e0d6eabda6a3b46be0f98918dc327c2eb44ae2b8f20" Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.847152 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f406f57b1aed489f01d7e0d6eabda6a3b46be0f98918dc327c2eb44ae2b8f20"} err="failed to get container status \"2f406f57b1aed489f01d7e0d6eabda6a3b46be0f98918dc327c2eb44ae2b8f20\": rpc error: code = NotFound desc = could not find container \"2f406f57b1aed489f01d7e0d6eabda6a3b46be0f98918dc327c2eb44ae2b8f20\": container with ID starting with 2f406f57b1aed489f01d7e0d6eabda6a3b46be0f98918dc327c2eb44ae2b8f20 not found: ID does not exist" Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.847170 4685 scope.go:117] "RemoveContainer" containerID="afd181791b9ffd26344f58b9bd55f908ec8be4288ead7e37013df654d16e931b" Mar 21 04:07:03 crc kubenswrapper[4685]: E0321 04:07:03.847499 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afd181791b9ffd26344f58b9bd55f908ec8be4288ead7e37013df654d16e931b\": container with ID starting with afd181791b9ffd26344f58b9bd55f908ec8be4288ead7e37013df654d16e931b not found: ID does not exist" containerID="afd181791b9ffd26344f58b9bd55f908ec8be4288ead7e37013df654d16e931b" Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.847523 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afd181791b9ffd26344f58b9bd55f908ec8be4288ead7e37013df654d16e931b"} err="failed to get container status \"afd181791b9ffd26344f58b9bd55f908ec8be4288ead7e37013df654d16e931b\": rpc error: code = NotFound desc = could not find container \"afd181791b9ffd26344f58b9bd55f908ec8be4288ead7e37013df654d16e931b\": container with ID starting with afd181791b9ffd26344f58b9bd55f908ec8be4288ead7e37013df654d16e931b not found: ID does not exist" Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.847537 4685 scope.go:117] "RemoveContainer" containerID="b3c849364bb2afcdc769b52ad87ef4c71391dab80d01f127496c1c5eeeb62a40" Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.854227 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/openstack-galera-2"] Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.859670 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/openstack-galera-2"] Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.888149 4685 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="barbican-kuttl-tests/memcached-0" Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.941008 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/91653387-c2f1-4240-b710-e0c709eb769d-kolla-config\") pod \"91653387-c2f1-4240-b710-e0c709eb769d\" (UID: \"91653387-c2f1-4240-b710-e0c709eb769d\") " Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.941149 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/91653387-c2f1-4240-b710-e0c709eb769d-config-data\") pod \"91653387-c2f1-4240-b710-e0c709eb769d\" (UID: \"91653387-c2f1-4240-b710-e0c709eb769d\") " Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.941234 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhlbn\" (UniqueName: \"kubernetes.io/projected/91653387-c2f1-4240-b710-e0c709eb769d-kube-api-access-nhlbn\") pod \"91653387-c2f1-4240-b710-e0c709eb769d\" (UID: \"91653387-c2f1-4240-b710-e0c709eb769d\") " Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.941435 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91653387-c2f1-4240-b710-e0c709eb769d-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "91653387-c2f1-4240-b710-e0c709eb769d" (UID: "91653387-c2f1-4240-b710-e0c709eb769d"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.941553 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91653387-c2f1-4240-b710-e0c709eb769d-config-data" (OuterVolumeSpecName: "config-data") pod "91653387-c2f1-4240-b710-e0c709eb769d" (UID: "91653387-c2f1-4240-b710-e0c709eb769d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.942464 4685 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f79a543a-5faf-40e2-9854-c5c6f19f4cec-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.942490 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ch62\" (UniqueName: \"kubernetes.io/projected/f79a543a-5faf-40e2-9854-c5c6f19f4cec-kube-api-access-5ch62\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.942500 4685 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/91653387-c2f1-4240-b710-e0c709eb769d-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.942508 4685 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/91653387-c2f1-4240-b710-e0c709eb769d-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:03 crc kubenswrapper[4685]: I0321 04:07:03.944394 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91653387-c2f1-4240-b710-e0c709eb769d-kube-api-access-nhlbn" (OuterVolumeSpecName: "kube-api-access-nhlbn") pod "91653387-c2f1-4240-b710-e0c709eb769d" (UID: "91653387-c2f1-4240-b710-e0c709eb769d"). InnerVolumeSpecName "kube-api-access-nhlbn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:07:04 crc kubenswrapper[4685]: I0321 04:07:04.043701 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhlbn\" (UniqueName: \"kubernetes.io/projected/91653387-c2f1-4240-b710-e0c709eb769d-kube-api-access-nhlbn\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:04 crc kubenswrapper[4685]: I0321 04:07:04.311202 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ee610c4-8416-4d1c-a6b4-2324f1541b1c" path="/var/lib/kubelet/pods/9ee610c4-8416-4d1c-a6b4-2324f1541b1c/volumes" Mar 21 04:07:04 crc kubenswrapper[4685]: I0321 04:07:04.312055 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f79a543a-5faf-40e2-9854-c5c6f19f4cec" path="/var/lib/kubelet/pods/f79a543a-5faf-40e2-9854-c5c6f19f4cec/volumes" Mar 21 04:07:04 crc kubenswrapper[4685]: E0321 04:07:04.347601 4685 configmap.go:193] Couldn't get configMap barbican-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Mar 21 04:07:04 crc kubenswrapper[4685]: E0321 04:07:04.347685 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1f046207-c975-4417-89f1-650002978bca-operator-scripts podName:1f046207-c975-4417-89f1-650002978bca nodeName:}" failed. No retries permitted until 2026-03-21 04:07:06.347664236 +0000 UTC m=+1258.824733038 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1f046207-c975-4417-89f1-650002978bca-operator-scripts") pod "keystonee186-account-delete-ppwv7" (UID: "1f046207-c975-4417-89f1-650002978bca") : configmap "openstack-scripts" not found Mar 21 04:07:04 crc kubenswrapper[4685]: I0321 04:07:04.381376 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5589cf8c54-6qwrt"] Mar 21 04:07:04 crc kubenswrapper[4685]: I0321 04:07:04.381572 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/barbican-operator-controller-manager-5589cf8c54-6qwrt" podUID="50509c19-c2fa-4171-a5f8-e4d699a9062c" containerName="manager" containerID="cri-o://cbbd1a2ce1003d134cae1726505335a0eaf118ab38ccb1008f2e26952fb34710" gracePeriod=10 Mar 21 04:07:04 crc kubenswrapper[4685]: I0321 04:07:04.419958 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/openstack-galera-1" podUID="ba532d6b-607c-450f-adb7-8d4e14ff58e0" containerName="galera" containerID="cri-o://1bdf9e4791a3ad52a31be260ed2573c28a204e6cfe08eb84afa1701f1ae9ad1e" gracePeriod=28 Mar 21 04:07:04 crc kubenswrapper[4685]: I0321 04:07:04.613831 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-index-m4x5b"] Mar 21 04:07:04 crc kubenswrapper[4685]: I0321 04:07:04.614552 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/barbican-operator-index-m4x5b" podUID="08b81240-20f5-499a-afd2-5666d0fa97e3" containerName="registry-server" containerID="cri-o://8096e48df5e824c20ac9ff8c44e5d00c0e44afcb986921a00eb143825932c6ef" gracePeriod=30 Mar 21 04:07:04 crc kubenswrapper[4685]: I0321 04:07:04.654661 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/dee043e15d25f12961d0325068fc60bde20860efbe3e725c5a2edfb26ch99b5"] Mar 21 04:07:04 crc kubenswrapper[4685]: I0321 04:07:04.661447 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-operators/dee043e15d25f12961d0325068fc60bde20860efbe3e725c5a2edfb26ch99b5"] Mar 21 04:07:04 crc kubenswrapper[4685]: I0321 04:07:04.787327 4685 generic.go:334] "Generic (PLEG): container finished" podID="08b81240-20f5-499a-afd2-5666d0fa97e3" containerID="8096e48df5e824c20ac9ff8c44e5d00c0e44afcb986921a00eb143825932c6ef" exitCode=0 Mar 21 04:07:04 crc kubenswrapper[4685]: I0321 04:07:04.787397 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-m4x5b" event={"ID":"08b81240-20f5-499a-afd2-5666d0fa97e3","Type":"ContainerDied","Data":"8096e48df5e824c20ac9ff8c44e5d00c0e44afcb986921a00eb143825932c6ef"} Mar 21 04:07:04 crc kubenswrapper[4685]: I0321 04:07:04.798343 4685 generic.go:334] "Generic (PLEG): container finished" podID="50509c19-c2fa-4171-a5f8-e4d699a9062c" containerID="cbbd1a2ce1003d134cae1726505335a0eaf118ab38ccb1008f2e26952fb34710" exitCode=0 Mar 21 04:07:04 crc kubenswrapper[4685]: I0321 04:07:04.798441 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5589cf8c54-6qwrt" event={"ID":"50509c19-c2fa-4171-a5f8-e4d699a9062c","Type":"ContainerDied","Data":"cbbd1a2ce1003d134cae1726505335a0eaf118ab38ccb1008f2e26952fb34710"} Mar 21 04:07:04 crc kubenswrapper[4685]: I0321 04:07:04.798467 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5589cf8c54-6qwrt" event={"ID":"50509c19-c2fa-4171-a5f8-e4d699a9062c","Type":"ContainerDied","Data":"fb8084639e3d5227e099fa993d8ddeee89b3980388b42ba56904a8ce72cbc9a0"} Mar 21 04:07:04 crc kubenswrapper[4685]: I0321 04:07:04.798478 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb8084639e3d5227e099fa993d8ddeee89b3980388b42ba56904a8ce72cbc9a0" Mar 21 04:07:04 crc kubenswrapper[4685]: I0321 04:07:04.806474 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/memcached-0" Mar 21 04:07:04 crc kubenswrapper[4685]: I0321 04:07:04.807757 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/memcached-0" event={"ID":"91653387-c2f1-4240-b710-e0c709eb769d","Type":"ContainerDied","Data":"029a15848e86b090e45728193416b05a946b75994bc4c7dbbeb230458a7ae8ea"} Mar 21 04:07:04 crc kubenswrapper[4685]: I0321 04:07:04.807803 4685 scope.go:117] "RemoveContainer" containerID="78e1e20967447cec136443726f8d2e76ac857183f77e3fec410e0f394f81503a" Mar 21 04:07:04 crc kubenswrapper[4685]: I0321 04:07:04.826635 4685 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="barbican-kuttl-tests/keystonee186-account-delete-ppwv7" secret="" err="secret \"galera-openstack-dockercfg-ntnkl\" not found" Mar 21 04:07:04 crc kubenswrapper[4685]: I0321 04:07:04.826701 4685 scope.go:117] "RemoveContainer" containerID="32ba97a86558db342b9a82184d606542f267144be2a0a73ed9ba300e6288ca0c" Mar 21 04:07:04 crc kubenswrapper[4685]: E0321 04:07:04.827024 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-delete\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-delete pod=keystonee186-account-delete-ppwv7_barbican-kuttl-tests(1f046207-c975-4417-89f1-650002978bca)\"" pod="barbican-kuttl-tests/keystonee186-account-delete-ppwv7" podUID="1f046207-c975-4417-89f1-650002978bca" Mar 21 04:07:04 crc kubenswrapper[4685]: I0321 04:07:04.835898 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5589cf8c54-6qwrt" Mar 21 04:07:04 crc kubenswrapper[4685]: I0321 04:07:04.844257 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/memcached-0"] Mar 21 04:07:04 crc kubenswrapper[4685]: I0321 04:07:04.889611 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/memcached-0"] Mar 21 04:07:04 crc kubenswrapper[4685]: I0321 04:07:04.954316 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/50509c19-c2fa-4171-a5f8-e4d699a9062c-webhook-cert\") pod \"50509c19-c2fa-4171-a5f8-e4d699a9062c\" (UID: \"50509c19-c2fa-4171-a5f8-e4d699a9062c\") " Mar 21 04:07:04 crc kubenswrapper[4685]: I0321 04:07:04.955099 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7c84\" (UniqueName: \"kubernetes.io/projected/50509c19-c2fa-4171-a5f8-e4d699a9062c-kube-api-access-l7c84\") pod \"50509c19-c2fa-4171-a5f8-e4d699a9062c\" (UID: \"50509c19-c2fa-4171-a5f8-e4d699a9062c\") " Mar 21 04:07:04 crc kubenswrapper[4685]: I0321 04:07:04.955215 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/50509c19-c2fa-4171-a5f8-e4d699a9062c-apiservice-cert\") pod \"50509c19-c2fa-4171-a5f8-e4d699a9062c\" (UID: \"50509c19-c2fa-4171-a5f8-e4d699a9062c\") " Mar 21 04:07:04 crc kubenswrapper[4685]: I0321 04:07:04.958722 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50509c19-c2fa-4171-a5f8-e4d699a9062c-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "50509c19-c2fa-4171-a5f8-e4d699a9062c" (UID: "50509c19-c2fa-4171-a5f8-e4d699a9062c"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:07:04 crc kubenswrapper[4685]: I0321 04:07:04.959019 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50509c19-c2fa-4171-a5f8-e4d699a9062c-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "50509c19-c2fa-4171-a5f8-e4d699a9062c" (UID: "50509c19-c2fa-4171-a5f8-e4d699a9062c"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:07:04 crc kubenswrapper[4685]: I0321 04:07:04.959202 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50509c19-c2fa-4171-a5f8-e4d699a9062c-kube-api-access-l7c84" (OuterVolumeSpecName: "kube-api-access-l7c84") pod "50509c19-c2fa-4171-a5f8-e4d699a9062c" (UID: "50509c19-c2fa-4171-a5f8-e4d699a9062c"). InnerVolumeSpecName "kube-api-access-l7c84". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.056360 4685 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/50509c19-c2fa-4171-a5f8-e4d699a9062c-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.056718 4685 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/50509c19-c2fa-4171-a5f8-e4d699a9062c-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.056731 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7c84\" (UniqueName: \"kubernetes.io/projected/50509c19-c2fa-4171-a5f8-e4d699a9062c-kube-api-access-l7c84\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.115362 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-index-m4x5b" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.158016 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npzkd\" (UniqueName: \"kubernetes.io/projected/08b81240-20f5-499a-afd2-5666d0fa97e3-kube-api-access-npzkd\") pod \"08b81240-20f5-499a-afd2-5666d0fa97e3\" (UID: \"08b81240-20f5-499a-afd2-5666d0fa97e3\") " Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.161301 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08b81240-20f5-499a-afd2-5666d0fa97e3-kube-api-access-npzkd" (OuterVolumeSpecName: "kube-api-access-npzkd") pod "08b81240-20f5-499a-afd2-5666d0fa97e3" (UID: "08b81240-20f5-499a-afd2-5666d0fa97e3"). InnerVolumeSpecName "kube-api-access-npzkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:07:05 crc kubenswrapper[4685]: E0321 04:07:05.200554 4685 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1bdf9e4791a3ad52a31be260ed2573c28a204e6cfe08eb84afa1701f1ae9ad1e" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.205149 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-8649f5b8f-mdp78" Mar 21 04:07:05 crc kubenswrapper[4685]: E0321 04:07:05.206114 4685 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1bdf9e4791a3ad52a31be260ed2573c28a204e6cfe08eb84afa1701f1ae9ad1e" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 21 04:07:05 crc kubenswrapper[4685]: E0321 04:07:05.207677 4685 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1bdf9e4791a3ad52a31be260ed2573c28a204e6cfe08eb84afa1701f1ae9ad1e" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 21 04:07:05 crc kubenswrapper[4685]: E0321 04:07:05.207726 4685 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="barbican-kuttl-tests/openstack-galera-1" podUID="ba532d6b-607c-450f-adb7-8d4e14ff58e0" containerName="galera" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.259167 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wzk7\" (UniqueName: \"kubernetes.io/projected/0b2b4768-f546-4fad-9609-8b01fa7749dc-kube-api-access-7wzk7\") pod \"0b2b4768-f546-4fad-9609-8b01fa7749dc\" (UID: \"0b2b4768-f546-4fad-9609-8b01fa7749dc\") " Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.259213 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0b2b4768-f546-4fad-9609-8b01fa7749dc-credential-keys\") pod \"0b2b4768-f546-4fad-9609-8b01fa7749dc\" (UID: \"0b2b4768-f546-4fad-9609-8b01fa7749dc\") " Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.259249 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b2b4768-f546-4fad-9609-8b01fa7749dc-scripts\") pod \"0b2b4768-f546-4fad-9609-8b01fa7749dc\" (UID: \"0b2b4768-f546-4fad-9609-8b01fa7749dc\") " Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.259309 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0b2b4768-f546-4fad-9609-8b01fa7749dc-fernet-keys\") pod \"0b2b4768-f546-4fad-9609-8b01fa7749dc\" (UID: \"0b2b4768-f546-4fad-9609-8b01fa7749dc\") " Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.259358 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b2b4768-f546-4fad-9609-8b01fa7749dc-config-data\") pod \"0b2b4768-f546-4fad-9609-8b01fa7749dc\" (UID: \"0b2b4768-f546-4fad-9609-8b01fa7749dc\") " Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.259648 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npzkd\" (UniqueName: \"kubernetes.io/projected/08b81240-20f5-499a-afd2-5666d0fa97e3-kube-api-access-npzkd\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.262046 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b2b4768-f546-4fad-9609-8b01fa7749dc-kube-api-access-7wzk7" (OuterVolumeSpecName: 
"kube-api-access-7wzk7") pod "0b2b4768-f546-4fad-9609-8b01fa7749dc" (UID: "0b2b4768-f546-4fad-9609-8b01fa7749dc"). InnerVolumeSpecName "kube-api-access-7wzk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.262448 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b2b4768-f546-4fad-9609-8b01fa7749dc-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0b2b4768-f546-4fad-9609-8b01fa7749dc" (UID: "0b2b4768-f546-4fad-9609-8b01fa7749dc"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.262692 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b2b4768-f546-4fad-9609-8b01fa7749dc-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0b2b4768-f546-4fad-9609-8b01fa7749dc" (UID: "0b2b4768-f546-4fad-9609-8b01fa7749dc"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.263574 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b2b4768-f546-4fad-9609-8b01fa7749dc-scripts" (OuterVolumeSpecName: "scripts") pod "0b2b4768-f546-4fad-9609-8b01fa7749dc" (UID: "0b2b4768-f546-4fad-9609-8b01fa7749dc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.278420 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b2b4768-f546-4fad-9609-8b01fa7749dc-config-data" (OuterVolumeSpecName: "config-data") pod "0b2b4768-f546-4fad-9609-8b01fa7749dc" (UID: "0b2b4768-f546-4fad-9609-8b01fa7749dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.280528 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/rabbitmq-server-0" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.360257 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d18693b0-ea2a-4795-a4de-15a379cc8490-rabbitmq-erlang-cookie\") pod \"d18693b0-ea2a-4795-a4de-15a379cc8490\" (UID: \"d18693b0-ea2a-4795-a4de-15a379cc8490\") " Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.360300 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgdkb\" (UniqueName: \"kubernetes.io/projected/d18693b0-ea2a-4795-a4de-15a379cc8490-kube-api-access-qgdkb\") pod \"d18693b0-ea2a-4795-a4de-15a379cc8490\" (UID: \"d18693b0-ea2a-4795-a4de-15a379cc8490\") " Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.360331 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d18693b0-ea2a-4795-a4de-15a379cc8490-rabbitmq-plugins\") pod \"d18693b0-ea2a-4795-a4de-15a379cc8490\" (UID: \"d18693b0-ea2a-4795-a4de-15a379cc8490\") " Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.360471 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd7b19e4-2e04-41ca-a42e-dec51f7bbe94\") pod \"d18693b0-ea2a-4795-a4de-15a379cc8490\" (UID: \"d18693b0-ea2a-4795-a4de-15a379cc8490\") " Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.360500 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d18693b0-ea2a-4795-a4de-15a379cc8490-pod-info\") pod \"d18693b0-ea2a-4795-a4de-15a379cc8490\" (UID: \"d18693b0-ea2a-4795-a4de-15a379cc8490\") " Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.360545 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d18693b0-ea2a-4795-a4de-15a379cc8490-rabbitmq-confd\") pod \"d18693b0-ea2a-4795-a4de-15a379cc8490\" (UID: \"d18693b0-ea2a-4795-a4de-15a379cc8490\") " Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.360589 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d18693b0-ea2a-4795-a4de-15a379cc8490-plugins-conf\") pod \"d18693b0-ea2a-4795-a4de-15a379cc8490\" (UID: \"d18693b0-ea2a-4795-a4de-15a379cc8490\") " Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.360610 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d18693b0-ea2a-4795-a4de-15a379cc8490-erlang-cookie-secret\") pod \"d18693b0-ea2a-4795-a4de-15a379cc8490\" (UID: \"d18693b0-ea2a-4795-a4de-15a379cc8490\") " Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.360866 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d18693b0-ea2a-4795-a4de-15a379cc8490-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "d18693b0-ea2a-4795-a4de-15a379cc8490" (UID: "d18693b0-ea2a-4795-a4de-15a379cc8490"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.360923 4685 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0b2b4768-f546-4fad-9609-8b01fa7749dc-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.360938 4685 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b2b4768-f546-4fad-9609-8b01fa7749dc-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.360950 4685 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0b2b4768-f546-4fad-9609-8b01fa7749dc-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.360964 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wzk7\" (UniqueName: \"kubernetes.io/projected/0b2b4768-f546-4fad-9609-8b01fa7749dc-kube-api-access-7wzk7\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.360972 4685 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b2b4768-f546-4fad-9609-8b01fa7749dc-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.361581 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d18693b0-ea2a-4795-a4de-15a379cc8490-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "d18693b0-ea2a-4795-a4de-15a379cc8490" (UID: "d18693b0-ea2a-4795-a4de-15a379cc8490"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.361623 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d18693b0-ea2a-4795-a4de-15a379cc8490-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "d18693b0-ea2a-4795-a4de-15a379cc8490" (UID: "d18693b0-ea2a-4795-a4de-15a379cc8490"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.364271 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d18693b0-ea2a-4795-a4de-15a379cc8490-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "d18693b0-ea2a-4795-a4de-15a379cc8490" (UID: "d18693b0-ea2a-4795-a4de-15a379cc8490"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.364367 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d18693b0-ea2a-4795-a4de-15a379cc8490-kube-api-access-qgdkb" (OuterVolumeSpecName: "kube-api-access-qgdkb") pod "d18693b0-ea2a-4795-a4de-15a379cc8490" (UID: "d18693b0-ea2a-4795-a4de-15a379cc8490"). InnerVolumeSpecName "kube-api-access-qgdkb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.364675 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/d18693b0-ea2a-4795-a4de-15a379cc8490-pod-info" (OuterVolumeSpecName: "pod-info") pod "d18693b0-ea2a-4795-a4de-15a379cc8490" (UID: "d18693b0-ea2a-4795-a4de-15a379cc8490"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.368853 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd7b19e4-2e04-41ca-a42e-dec51f7bbe94" (OuterVolumeSpecName: "persistence") pod "d18693b0-ea2a-4795-a4de-15a379cc8490" (UID: "d18693b0-ea2a-4795-a4de-15a379cc8490"). InnerVolumeSpecName "pvc-cd7b19e4-2e04-41ca-a42e-dec51f7bbe94". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.416855 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d18693b0-ea2a-4795-a4de-15a379cc8490-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "d18693b0-ea2a-4795-a4de-15a379cc8490" (UID: "d18693b0-ea2a-4795-a4de-15a379cc8490"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.461921 4685 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d18693b0-ea2a-4795-a4de-15a379cc8490-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.462331 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgdkb\" (UniqueName: \"kubernetes.io/projected/d18693b0-ea2a-4795-a4de-15a379cc8490-kube-api-access-qgdkb\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.462344 4685 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d18693b0-ea2a-4795-a4de-15a379cc8490-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.462386 4685 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-cd7b19e4-2e04-41ca-a42e-dec51f7bbe94\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd7b19e4-2e04-41ca-a42e-dec51f7bbe94\") on node \"crc\" " Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.462404 4685 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d18693b0-ea2a-4795-a4de-15a379cc8490-pod-info\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.462418 4685 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d18693b0-ea2a-4795-a4de-15a379cc8490-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.462428 4685 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d18693b0-ea2a-4795-a4de-15a379cc8490-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.462439 4685 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d18693b0-ea2a-4795-a4de-15a379cc8490-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.478898 4685 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.479075 4685 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-cd7b19e4-2e04-41ca-a42e-dec51f7bbe94" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd7b19e4-2e04-41ca-a42e-dec51f7bbe94") on node "crc" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.563791 4685 reconciler_common.go:293] "Volume detached for volume \"pvc-cd7b19e4-2e04-41ca-a42e-dec51f7bbe94\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd7b19e4-2e04-41ca-a42e-dec51f7bbe94\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.841724 4685 generic.go:334] "Generic (PLEG): container finished" podID="d18693b0-ea2a-4795-a4de-15a379cc8490" containerID="d744b99d0f599149431f3be36e7e73a2b55609a1d321ca7ef292c0e1275ec425" exitCode=0 Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.841775 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/rabbitmq-server-0" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.841895 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/rabbitmq-server-0" event={"ID":"d18693b0-ea2a-4795-a4de-15a379cc8490","Type":"ContainerDied","Data":"d744b99d0f599149431f3be36e7e73a2b55609a1d321ca7ef292c0e1275ec425"} Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.841970 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/rabbitmq-server-0" event={"ID":"d18693b0-ea2a-4795-a4de-15a379cc8490","Type":"ContainerDied","Data":"2f5496e15c04ecc25ede871af6a6220aaba433c9942d07d8afbf0f7ad00c2d45"} Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.842022 4685 scope.go:117] "RemoveContainer" containerID="d744b99d0f599149431f3be36e7e73a2b55609a1d321ca7ef292c0e1275ec425" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.843883 4685 generic.go:334] "Generic (PLEG): container finished" podID="0b2b4768-f546-4fad-9609-8b01fa7749dc" containerID="22d93438459a6e12b158d3b08c3f4ece277363dc0de50210f3185acb8116f22d" exitCode=0 Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.843928 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-8649f5b8f-mdp78" event={"ID":"0b2b4768-f546-4fad-9609-8b01fa7749dc","Type":"ContainerDied","Data":"22d93438459a6e12b158d3b08c3f4ece277363dc0de50210f3185acb8116f22d"} Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.843961 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-8649f5b8f-mdp78" event={"ID":"0b2b4768-f546-4fad-9609-8b01fa7749dc","Type":"ContainerDied","Data":"195a642aaffcc74f1c4f45bc57d66bac5c1dbd31403b1147e31b3cbf02f60cbd"} Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.843911 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-8649f5b8f-mdp78" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.849450 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-index-m4x5b" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.850674 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-m4x5b" event={"ID":"08b81240-20f5-499a-afd2-5666d0fa97e3","Type":"ContainerDied","Data":"f1091d9223c67499bfb71dfe286ed0918c22d46b2490ed927983f03f8a206990"} Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.852352 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5589cf8c54-6qwrt" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.978740 4685 scope.go:117] "RemoveContainer" containerID="0b5a6c171308ac60c554fab568debabcf1c313fb15eededa7eb38e4bb375eb1b" Mar 21 04:07:05 crc kubenswrapper[4685]: I0321 04:07:05.989098 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/rabbitmq-server-0"] Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.001459 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/rabbitmq-server-0"] Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.017412 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5589cf8c54-6qwrt"] Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.023442 4685 scope.go:117] "RemoveContainer" containerID="d744b99d0f599149431f3be36e7e73a2b55609a1d321ca7ef292c0e1275ec425" Mar 21 04:07:06 crc kubenswrapper[4685]: E0321 04:07:06.024240 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d744b99d0f599149431f3be36e7e73a2b55609a1d321ca7ef292c0e1275ec425\": container with ID starting with d744b99d0f599149431f3be36e7e73a2b55609a1d321ca7ef292c0e1275ec425 not found: ID does not exist" containerID="d744b99d0f599149431f3be36e7e73a2b55609a1d321ca7ef292c0e1275ec425" Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.024280 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d744b99d0f599149431f3be36e7e73a2b55609a1d321ca7ef292c0e1275ec425"} err="failed to get container status \"d744b99d0f599149431f3be36e7e73a2b55609a1d321ca7ef292c0e1275ec425\": rpc error: code = NotFound desc = could not find container \"d744b99d0f599149431f3be36e7e73a2b55609a1d321ca7ef292c0e1275ec425\": container with ID starting with d744b99d0f599149431f3be36e7e73a2b55609a1d321ca7ef292c0e1275ec425 not found: ID does not exist" Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.024305 4685 scope.go:117] "RemoveContainer" containerID="0b5a6c171308ac60c554fab568debabcf1c313fb15eededa7eb38e4bb375eb1b" Mar 21 04:07:06 crc kubenswrapper[4685]: E0321 04:07:06.025934 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b5a6c171308ac60c554fab568debabcf1c313fb15eededa7eb38e4bb375eb1b\": container with ID starting with 0b5a6c171308ac60c554fab568debabcf1c313fb15eededa7eb38e4bb375eb1b not found: ID does not exist" containerID="0b5a6c171308ac60c554fab568debabcf1c313fb15eededa7eb38e4bb375eb1b" Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.025964 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b5a6c171308ac60c554fab568debabcf1c313fb15eededa7eb38e4bb375eb1b"} err="failed to get container status \"0b5a6c171308ac60c554fab568debabcf1c313fb15eededa7eb38e4bb375eb1b\": 
rpc error: code = NotFound desc = could not find container \"0b5a6c171308ac60c554fab568debabcf1c313fb15eededa7eb38e4bb375eb1b\": container with ID starting with 0b5a6c171308ac60c554fab568debabcf1c313fb15eededa7eb38e4bb375eb1b not found: ID does not exist" Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.025979 4685 scope.go:117] "RemoveContainer" containerID="22d93438459a6e12b158d3b08c3f4ece277363dc0de50210f3185acb8116f22d" Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.028754 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5589cf8c54-6qwrt"] Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.043290 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystone-8649f5b8f-mdp78"] Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.047713 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/keystone-8649f5b8f-mdp78"] Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.052362 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-index-m4x5b"] Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.055869 4685 scope.go:117] "RemoveContainer" containerID="22d93438459a6e12b158d3b08c3f4ece277363dc0de50210f3185acb8116f22d" Mar 21 04:07:06 crc kubenswrapper[4685]: E0321 04:07:06.056345 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22d93438459a6e12b158d3b08c3f4ece277363dc0de50210f3185acb8116f22d\": container with ID starting with 22d93438459a6e12b158d3b08c3f4ece277363dc0de50210f3185acb8116f22d not found: ID does not exist" containerID="22d93438459a6e12b158d3b08c3f4ece277363dc0de50210f3185acb8116f22d" Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.056489 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22d93438459a6e12b158d3b08c3f4ece277363dc0de50210f3185acb8116f22d"} err="failed to get container status \"22d93438459a6e12b158d3b08c3f4ece277363dc0de50210f3185acb8116f22d\": rpc error: code = NotFound desc = could not find container \"22d93438459a6e12b158d3b08c3f4ece277363dc0de50210f3185acb8116f22d\": container with ID starting with 22d93438459a6e12b158d3b08c3f4ece277363dc0de50210f3185acb8116f22d not found: ID does not exist" Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.056555 4685 scope.go:117] "RemoveContainer" containerID="8096e48df5e824c20ac9ff8c44e5d00c0e44afcb986921a00eb143825932c6ef" Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.056635 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/barbican-operator-index-m4x5b"] Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.217865 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-1" Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.271233 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ba532d6b-607c-450f-adb7-8d4e14ff58e0-kolla-config\") pod \"ba532d6b-607c-450f-adb7-8d4e14ff58e0\" (UID: \"ba532d6b-607c-450f-adb7-8d4e14ff58e0\") " Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.271383 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzqbf\" (UniqueName: \"kubernetes.io/projected/ba532d6b-607c-450f-adb7-8d4e14ff58e0-kube-api-access-jzqbf\") pod \"ba532d6b-607c-450f-adb7-8d4e14ff58e0\" (UID: \"ba532d6b-607c-450f-adb7-8d4e14ff58e0\") " Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.271425 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba532d6b-607c-450f-adb7-8d4e14ff58e0-operator-scripts\") pod \"ba532d6b-607c-450f-adb7-8d4e14ff58e0\" (UID: \"ba532d6b-607c-450f-adb7-8d4e14ff58e0\") " Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.271462 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ba532d6b-607c-450f-adb7-8d4e14ff58e0-config-data-default\") pod \"ba532d6b-607c-450f-adb7-8d4e14ff58e0\" (UID: \"ba532d6b-607c-450f-adb7-8d4e14ff58e0\") " Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.271489 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ba532d6b-607c-450f-adb7-8d4e14ff58e0\" (UID: \"ba532d6b-607c-450f-adb7-8d4e14ff58e0\") " Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.271531 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ba532d6b-607c-450f-adb7-8d4e14ff58e0-config-data-generated\") pod \"ba532d6b-607c-450f-adb7-8d4e14ff58e0\" (UID: \"ba532d6b-607c-450f-adb7-8d4e14ff58e0\") " Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.271880 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba532d6b-607c-450f-adb7-8d4e14ff58e0-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "ba532d6b-607c-450f-adb7-8d4e14ff58e0" (UID: "ba532d6b-607c-450f-adb7-8d4e14ff58e0"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.272098 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba532d6b-607c-450f-adb7-8d4e14ff58e0-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "ba532d6b-607c-450f-adb7-8d4e14ff58e0" (UID: "ba532d6b-607c-450f-adb7-8d4e14ff58e0"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.272122 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba532d6b-607c-450f-adb7-8d4e14ff58e0-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "ba532d6b-607c-450f-adb7-8d4e14ff58e0" (UID: "ba532d6b-607c-450f-adb7-8d4e14ff58e0"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.272611 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba532d6b-607c-450f-adb7-8d4e14ff58e0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ba532d6b-607c-450f-adb7-8d4e14ff58e0" (UID: "ba532d6b-607c-450f-adb7-8d4e14ff58e0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.274485 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba532d6b-607c-450f-adb7-8d4e14ff58e0-kube-api-access-jzqbf" (OuterVolumeSpecName: "kube-api-access-jzqbf") pod "ba532d6b-607c-450f-adb7-8d4e14ff58e0" (UID: "ba532d6b-607c-450f-adb7-8d4e14ff58e0"). InnerVolumeSpecName "kube-api-access-jzqbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.279981 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "mysql-db") pod "ba532d6b-607c-450f-adb7-8d4e14ff58e0" (UID: "ba532d6b-607c-450f-adb7-8d4e14ff58e0"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.308042 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08b81240-20f5-499a-afd2-5666d0fa97e3" path="/var/lib/kubelet/pods/08b81240-20f5-499a-afd2-5666d0fa97e3/volumes" Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.308891 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b2b4768-f546-4fad-9609-8b01fa7749dc" path="/var/lib/kubelet/pods/0b2b4768-f546-4fad-9609-8b01fa7749dc/volumes" Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.309591 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50509c19-c2fa-4171-a5f8-e4d699a9062c" path="/var/lib/kubelet/pods/50509c19-c2fa-4171-a5f8-e4d699a9062c/volumes" Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.310263 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91653387-c2f1-4240-b710-e0c709eb769d" path="/var/lib/kubelet/pods/91653387-c2f1-4240-b710-e0c709eb769d/volumes" Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.311734 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2" path="/var/lib/kubelet/pods/a7d82eb2-9e3c-4a49-b3d5-3e66eb5f72d2/volumes" Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.312821 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d18693b0-ea2a-4795-a4de-15a379cc8490" path="/var/lib/kubelet/pods/d18693b0-ea2a-4795-a4de-15a379cc8490/volumes" Mar 21 04:07:06 crc kubenswrapper[4685]: E0321 04:07:06.373484 4685 configmap.go:193] Couldn't get configMap barbican-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Mar 21 04:07:06 crc kubenswrapper[4685]: E0321 04:07:06.373556 4685 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1f046207-c975-4417-89f1-650002978bca-operator-scripts podName:1f046207-c975-4417-89f1-650002978bca nodeName:}" failed. No retries permitted until 2026-03-21 04:07:10.373537599 +0000 UTC m=+1262.850606391 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1f046207-c975-4417-89f1-650002978bca-operator-scripts") pod "keystonee186-account-delete-ppwv7" (UID: "1f046207-c975-4417-89f1-650002978bca") : configmap "openstack-scripts" not found Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.373744 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzqbf\" (UniqueName: \"kubernetes.io/projected/ba532d6b-607c-450f-adb7-8d4e14ff58e0-kube-api-access-jzqbf\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.373757 4685 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba532d6b-607c-450f-adb7-8d4e14ff58e0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.373766 4685 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ba532d6b-607c-450f-adb7-8d4e14ff58e0-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.373783 4685 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.373795 4685 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ba532d6b-607c-450f-adb7-8d4e14ff58e0-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.373803 4685 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ba532d6b-607c-450f-adb7-8d4e14ff58e0-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.384007 4685 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.474345 4685 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.500719 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/openstack-galera-0" podUID="a69263a8-bd4d-476c-99fc-f1202f36f8a0" containerName="galera" containerID="cri-o://6ef5c5d3b88cab93dc79845560aae01f51beff644d970c48e0096ebed008e109" gracePeriod=26 Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.591666 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystone-db-create-27vgk"] Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.596552 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/keystone-db-create-27vgk"] Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.609995 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystone-e186-account-create-update-b7zr6"] Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.616580 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystonee186-account-delete-ppwv7"] Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.621417 4685 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["barbican-kuttl-tests/keystone-e186-account-create-update-b7zr6"] Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.817528 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystonee186-account-delete-ppwv7" Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.873528 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystonee186-account-delete-ppwv7" Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.873531 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystonee186-account-delete-ppwv7" event={"ID":"1f046207-c975-4417-89f1-650002978bca","Type":"ContainerDied","Data":"0579e74b9950502a977d96d5ac9eb3bf8539add74541b0f08d58fa5744050c41"} Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.873698 4685 scope.go:117] "RemoveContainer" containerID="32ba97a86558db342b9a82184d606542f267144be2a0a73ed9ba300e6288ca0c" Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.880287 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5klpq\" (UniqueName: \"kubernetes.io/projected/1f046207-c975-4417-89f1-650002978bca-kube-api-access-5klpq\") pod \"1f046207-c975-4417-89f1-650002978bca\" (UID: \"1f046207-c975-4417-89f1-650002978bca\") " Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.880429 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f046207-c975-4417-89f1-650002978bca-operator-scripts\") pod \"1f046207-c975-4417-89f1-650002978bca\" (UID: \"1f046207-c975-4417-89f1-650002978bca\") " Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.881119 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f046207-c975-4417-89f1-650002978bca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1f046207-c975-4417-89f1-650002978bca" (UID: "1f046207-c975-4417-89f1-650002978bca"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.881977 4685 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f046207-c975-4417-89f1-650002978bca-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.899218 4685 generic.go:334] "Generic (PLEG): container finished" podID="ba532d6b-607c-450f-adb7-8d4e14ff58e0" containerID="1bdf9e4791a3ad52a31be260ed2573c28a204e6cfe08eb84afa1701f1ae9ad1e" exitCode=0 Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.899305 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-1" event={"ID":"ba532d6b-607c-450f-adb7-8d4e14ff58e0","Type":"ContainerDied","Data":"1bdf9e4791a3ad52a31be260ed2573c28a204e6cfe08eb84afa1701f1ae9ad1e"} Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.899334 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-1" event={"ID":"ba532d6b-607c-450f-adb7-8d4e14ff58e0","Type":"ContainerDied","Data":"6ce6283a7b17cf5db4ce52dd0c93515b94c32f6a225cdc2a9ea154d431d6c3c4"} Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.899349 4685 scope.go:117] "RemoveContainer" containerID="1bdf9e4791a3ad52a31be260ed2573c28a204e6cfe08eb84afa1701f1ae9ad1e" Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.899499 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-1" Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.900808 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f046207-c975-4417-89f1-650002978bca-kube-api-access-5klpq" (OuterVolumeSpecName: "kube-api-access-5klpq") pod "1f046207-c975-4417-89f1-650002978bca" (UID: "1f046207-c975-4417-89f1-650002978bca"). InnerVolumeSpecName "kube-api-access-5klpq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:07:06 crc kubenswrapper[4685]: I0321 04:07:06.988561 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5klpq\" (UniqueName: \"kubernetes.io/projected/1f046207-c975-4417-89f1-650002978bca-kube-api-access-5klpq\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.016124 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/openstack-galera-1"] Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.017464 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/openstack-galera-1"] Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.036181 4685 scope.go:117] "RemoveContainer" containerID="571f7affd534842985c2e3a136e5f111bc11cbc05ccb19a322fe4c8369044e8d" Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.061770 4685 scope.go:117] "RemoveContainer" containerID="1bdf9e4791a3ad52a31be260ed2573c28a204e6cfe08eb84afa1701f1ae9ad1e" Mar 21 04:07:07 crc kubenswrapper[4685]: E0321 04:07:07.062128 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bdf9e4791a3ad52a31be260ed2573c28a204e6cfe08eb84afa1701f1ae9ad1e\": container with ID starting with 1bdf9e4791a3ad52a31be260ed2573c28a204e6cfe08eb84afa1701f1ae9ad1e not found: ID does not exist" containerID="1bdf9e4791a3ad52a31be260ed2573c28a204e6cfe08eb84afa1701f1ae9ad1e" Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.062165 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bdf9e4791a3ad52a31be260ed2573c28a204e6cfe08eb84afa1701f1ae9ad1e"} err="failed to get container status \"1bdf9e4791a3ad52a31be260ed2573c28a204e6cfe08eb84afa1701f1ae9ad1e\": rpc error: code = NotFound desc = could not find container \"1bdf9e4791a3ad52a31be260ed2573c28a204e6cfe08eb84afa1701f1ae9ad1e\": container with ID starting with 1bdf9e4791a3ad52a31be260ed2573c28a204e6cfe08eb84afa1701f1ae9ad1e not found: ID does not exist" Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.062190 4685 scope.go:117] "RemoveContainer" containerID="571f7affd534842985c2e3a136e5f111bc11cbc05ccb19a322fe4c8369044e8d" Mar 21 04:07:07 crc kubenswrapper[4685]: E0321 04:07:07.062593 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"571f7affd534842985c2e3a136e5f111bc11cbc05ccb19a322fe4c8369044e8d\": container with ID starting with 571f7affd534842985c2e3a136e5f111bc11cbc05ccb19a322fe4c8369044e8d not found: ID does not exist" containerID="571f7affd534842985c2e3a136e5f111bc11cbc05ccb19a322fe4c8369044e8d" Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.062810 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"571f7affd534842985c2e3a136e5f111bc11cbc05ccb19a322fe4c8369044e8d"} err="failed to get container status \"571f7affd534842985c2e3a136e5f111bc11cbc05ccb19a322fe4c8369044e8d\": rpc error: code = NotFound desc = could not find container \"571f7affd534842985c2e3a136e5f111bc11cbc05ccb19a322fe4c8369044e8d\": container with ID starting with 571f7affd534842985c2e3a136e5f111bc11cbc05ccb19a322fe4c8369044e8d not found: ID does not exist" Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.115567 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-0" Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.191518 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"a69263a8-bd4d-476c-99fc-f1202f36f8a0\" (UID: \"a69263a8-bd4d-476c-99fc-f1202f36f8a0\") " Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.191681 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a69263a8-bd4d-476c-99fc-f1202f36f8a0-kolla-config\") pod \"a69263a8-bd4d-476c-99fc-f1202f36f8a0\" (UID: \"a69263a8-bd4d-476c-99fc-f1202f36f8a0\") " Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.191726 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xgm6\" (UniqueName: \"kubernetes.io/projected/a69263a8-bd4d-476c-99fc-f1202f36f8a0-kube-api-access-6xgm6\") pod \"a69263a8-bd4d-476c-99fc-f1202f36f8a0\" (UID: \"a69263a8-bd4d-476c-99fc-f1202f36f8a0\") " Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.191791 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a69263a8-bd4d-476c-99fc-f1202f36f8a0-config-data-default\") pod \"a69263a8-bd4d-476c-99fc-f1202f36f8a0\" (UID: \"a69263a8-bd4d-476c-99fc-f1202f36f8a0\") " Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.191821 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a69263a8-bd4d-476c-99fc-f1202f36f8a0-operator-scripts\") pod \"a69263a8-bd4d-476c-99fc-f1202f36f8a0\" (UID: \"a69263a8-bd4d-476c-99fc-f1202f36f8a0\") " Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.191861 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a69263a8-bd4d-476c-99fc-f1202f36f8a0-config-data-generated\") pod \"a69263a8-bd4d-476c-99fc-f1202f36f8a0\" (UID: \"a69263a8-bd4d-476c-99fc-f1202f36f8a0\") " Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.192564 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a69263a8-bd4d-476c-99fc-f1202f36f8a0-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "a69263a8-bd4d-476c-99fc-f1202f36f8a0" (UID: "a69263a8-bd4d-476c-99fc-f1202f36f8a0"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.192617 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a69263a8-bd4d-476c-99fc-f1202f36f8a0-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "a69263a8-bd4d-476c-99fc-f1202f36f8a0" (UID: "a69263a8-bd4d-476c-99fc-f1202f36f8a0"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.192695 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a69263a8-bd4d-476c-99fc-f1202f36f8a0-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "a69263a8-bd4d-476c-99fc-f1202f36f8a0" (UID: "a69263a8-bd4d-476c-99fc-f1202f36f8a0"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.193504 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a69263a8-bd4d-476c-99fc-f1202f36f8a0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a69263a8-bd4d-476c-99fc-f1202f36f8a0" (UID: "a69263a8-bd4d-476c-99fc-f1202f36f8a0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.195978 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a69263a8-bd4d-476c-99fc-f1202f36f8a0-kube-api-access-6xgm6" (OuterVolumeSpecName: "kube-api-access-6xgm6") pod "a69263a8-bd4d-476c-99fc-f1202f36f8a0" (UID: "a69263a8-bd4d-476c-99fc-f1202f36f8a0"). InnerVolumeSpecName "kube-api-access-6xgm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.200892 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "mysql-db") pod "a69263a8-bd4d-476c-99fc-f1202f36f8a0" (UID: "a69263a8-bd4d-476c-99fc-f1202f36f8a0"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.227386 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystonee186-account-delete-ppwv7"] Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.234569 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/keystonee186-account-delete-ppwv7"] Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.293493 4685 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.293531 4685 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a69263a8-bd4d-476c-99fc-f1202f36f8a0-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.293546 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xgm6\" (UniqueName: \"kubernetes.io/projected/a69263a8-bd4d-476c-99fc-f1202f36f8a0-kube-api-access-6xgm6\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.293559 4685 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a69263a8-bd4d-476c-99fc-f1202f36f8a0-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.293571 4685 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a69263a8-bd4d-476c-99fc-f1202f36f8a0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.293583 4685 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a69263a8-bd4d-476c-99fc-f1202f36f8a0-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.308399 4685 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.395588 4685 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.914224 4685 generic.go:334] "Generic (PLEG): container finished" podID="a69263a8-bd4d-476c-99fc-f1202f36f8a0" containerID="6ef5c5d3b88cab93dc79845560aae01f51beff644d970c48e0096ebed008e109" exitCode=0 Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.914291 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-0" event={"ID":"a69263a8-bd4d-476c-99fc-f1202f36f8a0","Type":"ContainerDied","Data":"6ef5c5d3b88cab93dc79845560aae01f51beff644d970c48e0096ebed008e109"} Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.914297 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-0" Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.914315 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-0" event={"ID":"a69263a8-bd4d-476c-99fc-f1202f36f8a0","Type":"ContainerDied","Data":"825b5fff34eacc0ff1793d29d8ff3144045342d1536eff1ccad3868db4ab618f"} Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.914338 4685 scope.go:117] "RemoveContainer" containerID="6ef5c5d3b88cab93dc79845560aae01f51beff644d970c48e0096ebed008e109" Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.931614 4685 scope.go:117] "RemoveContainer" containerID="1a22266022bad721a506164b62c635ec56de1ed3b509234096cfdda97ceead41" Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.946121 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/openstack-galera-0"] Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.953600 4685 scope.go:117] "RemoveContainer" containerID="6ef5c5d3b88cab93dc79845560aae01f51beff644d970c48e0096ebed008e109" Mar 21 04:07:07 crc kubenswrapper[4685]: E0321 04:07:07.953957 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ef5c5d3b88cab93dc79845560aae01f51beff644d970c48e0096ebed008e109\": container with ID starting with 6ef5c5d3b88cab93dc79845560aae01f51beff644d970c48e0096ebed008e109 not found: ID does not exist" containerID="6ef5c5d3b88cab93dc79845560aae01f51beff644d970c48e0096ebed008e109" Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.953999 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ef5c5d3b88cab93dc79845560aae01f51beff644d970c48e0096ebed008e109"} err="failed to get container status \"6ef5c5d3b88cab93dc79845560aae01f51beff644d970c48e0096ebed008e109\": rpc error: code = NotFound desc = could not find container \"6ef5c5d3b88cab93dc79845560aae01f51beff644d970c48e0096ebed008e109\": container with ID starting with 6ef5c5d3b88cab93dc79845560aae01f51beff644d970c48e0096ebed008e109 not found: ID does not exist" Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.954024 4685 scope.go:117] "RemoveContainer" containerID="1a22266022bad721a506164b62c635ec56de1ed3b509234096cfdda97ceead41" Mar 21 04:07:07 crc kubenswrapper[4685]: E0321 04:07:07.954441 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1a22266022bad721a506164b62c635ec56de1ed3b509234096cfdda97ceead41\": container with ID starting with 1a22266022bad721a506164b62c635ec56de1ed3b509234096cfdda97ceead41 not found: ID does not exist" containerID="1a22266022bad721a506164b62c635ec56de1ed3b509234096cfdda97ceead41" Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.954471 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a22266022bad721a506164b62c635ec56de1ed3b509234096cfdda97ceead41"} err="failed to get container status \"1a22266022bad721a506164b62c635ec56de1ed3b509234096cfdda97ceead41\": rpc error: code = NotFound desc = could not find container \"1a22266022bad721a506164b62c635ec56de1ed3b509234096cfdda97ceead41\": container with ID starting with 1a22266022bad721a506164b62c635ec56de1ed3b509234096cfdda97ceead41 not found: ID does not exist" Mar 21 04:07:07 crc kubenswrapper[4685]: I0321 04:07:07.955406 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/openstack-galera-0"] Mar 21 04:07:08 crc kubenswrapper[4685]: I0321 04:07:08.293537 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5c456858cb-jnxfq"] Mar 21 04:07:08 crc kubenswrapper[4685]: I0321 04:07:08.293768 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-controller-manager-5c456858cb-jnxfq" podUID="7e33f8cc-f6fa-48ab-a172-74892478c268" containerName="manager" containerID="cri-o://e2d92cc4f8795aed53ebcd983b89eb996fa6974a422c65d6956148de1c17f3c7" gracePeriod=10 Mar 21 04:07:08 crc kubenswrapper[4685]: I0321 04:07:08.310433 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f046207-c975-4417-89f1-650002978bca" path="/var/lib/kubelet/pods/1f046207-c975-4417-89f1-650002978bca/volumes" Mar 21 04:07:08 crc kubenswrapper[4685]: I0321 04:07:08.311193 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="561e0682-6870-4559-b124-759d2f133a56" path="/var/lib/kubelet/pods/561e0682-6870-4559-b124-759d2f133a56/volumes" Mar 21 04:07:08 crc kubenswrapper[4685]: I0321 04:07:08.311659 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83bd1c48-1210-4b1a-af10-08b876ec7665" path="/var/lib/kubelet/pods/83bd1c48-1210-4b1a-af10-08b876ec7665/volumes" Mar 21 04:07:08 crc kubenswrapper[4685]: I0321 04:07:08.312654 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a69263a8-bd4d-476c-99fc-f1202f36f8a0" path="/var/lib/kubelet/pods/a69263a8-bd4d-476c-99fc-f1202f36f8a0/volumes" Mar 21 04:07:08 crc kubenswrapper[4685]: I0321 04:07:08.313250 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba532d6b-607c-450f-adb7-8d4e14ff58e0" path="/var/lib/kubelet/pods/ba532d6b-607c-450f-adb7-8d4e14ff58e0/volumes" Mar 21 04:07:08 crc kubenswrapper[4685]: I0321 04:07:08.606168 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-h4hc9"] Mar 21 04:07:08 crc kubenswrapper[4685]: I0321 04:07:08.606645 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-index-h4hc9" podUID="e92e8d0a-5e30-4648-b7b5-9b6040db75f0" containerName="registry-server" containerID="cri-o://bd6b7473b6bccdaa548f8ab30e82f3ea62e21f216d4fe508e8351b54c92bc0cb" gracePeriod=30 Mar 21 04:07:08 crc kubenswrapper[4685]: I0321 04:07:08.630240 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-operators/25b5da83b847401861089ca4a9b7591d95e0d659639006dce400c423edngstq"] Mar 21 04:07:08 crc kubenswrapper[4685]: I0321 04:07:08.638576 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/25b5da83b847401861089ca4a9b7591d95e0d659639006dce400c423edngstq"] Mar 21 04:07:08 crc kubenswrapper[4685]: I0321 04:07:08.751940 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5c456858cb-jnxfq" Mar 21 04:07:08 crc kubenswrapper[4685]: I0321 04:07:08.816392 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7e33f8cc-f6fa-48ab-a172-74892478c268-webhook-cert\") pod \"7e33f8cc-f6fa-48ab-a172-74892478c268\" (UID: \"7e33f8cc-f6fa-48ab-a172-74892478c268\") " Mar 21 04:07:08 crc kubenswrapper[4685]: I0321 04:07:08.816456 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4692\" (UniqueName: \"kubernetes.io/projected/7e33f8cc-f6fa-48ab-a172-74892478c268-kube-api-access-q4692\") pod \"7e33f8cc-f6fa-48ab-a172-74892478c268\" (UID: \"7e33f8cc-f6fa-48ab-a172-74892478c268\") " Mar 21 04:07:08 crc kubenswrapper[4685]: I0321 04:07:08.816492 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7e33f8cc-f6fa-48ab-a172-74892478c268-apiservice-cert\") pod \"7e33f8cc-f6fa-48ab-a172-74892478c268\" (UID: \"7e33f8cc-f6fa-48ab-a172-74892478c268\") " Mar 21 04:07:08 crc kubenswrapper[4685]: I0321 04:07:08.827686 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e33f8cc-f6fa-48ab-a172-74892478c268-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "7e33f8cc-f6fa-48ab-a172-74892478c268" (UID: "7e33f8cc-f6fa-48ab-a172-74892478c268"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:07:08 crc kubenswrapper[4685]: I0321 04:07:08.827764 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e33f8cc-f6fa-48ab-a172-74892478c268-kube-api-access-q4692" (OuterVolumeSpecName: "kube-api-access-q4692") pod "7e33f8cc-f6fa-48ab-a172-74892478c268" (UID: "7e33f8cc-f6fa-48ab-a172-74892478c268"). InnerVolumeSpecName "kube-api-access-q4692". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:07:08 crc kubenswrapper[4685]: I0321 04:07:08.828035 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e33f8cc-f6fa-48ab-a172-74892478c268-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "7e33f8cc-f6fa-48ab-a172-74892478c268" (UID: "7e33f8cc-f6fa-48ab-a172-74892478c268"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:07:08 crc kubenswrapper[4685]: I0321 04:07:08.921170 4685 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7e33f8cc-f6fa-48ab-a172-74892478c268-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:08 crc kubenswrapper[4685]: I0321 04:07:08.921228 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4692\" (UniqueName: \"kubernetes.io/projected/7e33f8cc-f6fa-48ab-a172-74892478c268-kube-api-access-q4692\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:08 crc kubenswrapper[4685]: I0321 04:07:08.921240 4685 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7e33f8cc-f6fa-48ab-a172-74892478c268-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:08 crc kubenswrapper[4685]: I0321 04:07:08.950788 4685 generic.go:334] "Generic (PLEG): container finished" podID="7e33f8cc-f6fa-48ab-a172-74892478c268" containerID="e2d92cc4f8795aed53ebcd983b89eb996fa6974a422c65d6956148de1c17f3c7" exitCode=0 Mar 21 04:07:08 crc kubenswrapper[4685]: I0321 04:07:08.950910 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5c456858cb-jnxfq" event={"ID":"7e33f8cc-f6fa-48ab-a172-74892478c268","Type":"ContainerDied","Data":"e2d92cc4f8795aed53ebcd983b89eb996fa6974a422c65d6956148de1c17f3c7"} Mar 21 04:07:08 crc kubenswrapper[4685]: I0321 04:07:08.950952 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5c456858cb-jnxfq" event={"ID":"7e33f8cc-f6fa-48ab-a172-74892478c268","Type":"ContainerDied","Data":"7aae4e911c8c43ac4d1f9e8baa7e39dfa7328d06acda028d8d2ce5a273801709"} Mar 21 04:07:08 crc kubenswrapper[4685]: I0321 04:07:08.950974 4685 scope.go:117] "RemoveContainer" containerID="e2d92cc4f8795aed53ebcd983b89eb996fa6974a422c65d6956148de1c17f3c7" Mar 21 04:07:08 crc kubenswrapper[4685]: I0321 04:07:08.951016 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5c456858cb-jnxfq" Mar 21 04:07:08 crc kubenswrapper[4685]: I0321 04:07:08.959170 4685 generic.go:334] "Generic (PLEG): container finished" podID="e92e8d0a-5e30-4648-b7b5-9b6040db75f0" containerID="bd6b7473b6bccdaa548f8ab30e82f3ea62e21f216d4fe508e8351b54c92bc0cb" exitCode=0 Mar 21 04:07:08 crc kubenswrapper[4685]: I0321 04:07:08.959444 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-h4hc9" event={"ID":"e92e8d0a-5e30-4648-b7b5-9b6040db75f0","Type":"ContainerDied","Data":"bd6b7473b6bccdaa548f8ab30e82f3ea62e21f216d4fe508e8351b54c92bc0cb"} Mar 21 04:07:09 crc kubenswrapper[4685]: I0321 04:07:09.006305 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-h4hc9" Mar 21 04:07:09 crc kubenswrapper[4685]: I0321 04:07:09.017534 4685 scope.go:117] "RemoveContainer" containerID="e2d92cc4f8795aed53ebcd983b89eb996fa6974a422c65d6956148de1c17f3c7" Mar 21 04:07:09 crc kubenswrapper[4685]: E0321 04:07:09.018131 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2d92cc4f8795aed53ebcd983b89eb996fa6974a422c65d6956148de1c17f3c7\": container with ID starting with e2d92cc4f8795aed53ebcd983b89eb996fa6974a422c65d6956148de1c17f3c7 not found: ID does not exist" containerID="e2d92cc4f8795aed53ebcd983b89eb996fa6974a422c65d6956148de1c17f3c7" Mar 21 04:07:09 crc kubenswrapper[4685]: I0321 04:07:09.018160 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2d92cc4f8795aed53ebcd983b89eb996fa6974a422c65d6956148de1c17f3c7"} err="failed to get container status \"e2d92cc4f8795aed53ebcd983b89eb996fa6974a422c65d6956148de1c17f3c7\": rpc error: code = NotFound desc = could not find container \"e2d92cc4f8795aed53ebcd983b89eb996fa6974a422c65d6956148de1c17f3c7\": container with ID starting with e2d92cc4f8795aed53ebcd983b89eb996fa6974a422c65d6956148de1c17f3c7 not found: ID does not exist" Mar 21 04:07:09 crc kubenswrapper[4685]: I0321 04:07:09.023745 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5c456858cb-jnxfq"] Mar 21 04:07:09 crc kubenswrapper[4685]: I0321 04:07:09.027669 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5c456858cb-jnxfq"] Mar 21 04:07:09 crc kubenswrapper[4685]: I0321 04:07:09.123379 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgm47\" (UniqueName: \"kubernetes.io/projected/e92e8d0a-5e30-4648-b7b5-9b6040db75f0-kube-api-access-rgm47\") pod \"e92e8d0a-5e30-4648-b7b5-9b6040db75f0\" (UID: \"e92e8d0a-5e30-4648-b7b5-9b6040db75f0\") " Mar 21 04:07:09 crc kubenswrapper[4685]: I0321 04:07:09.127780 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e92e8d0a-5e30-4648-b7b5-9b6040db75f0-kube-api-access-rgm47" (OuterVolumeSpecName: "kube-api-access-rgm47") pod "e92e8d0a-5e30-4648-b7b5-9b6040db75f0" (UID: "e92e8d0a-5e30-4648-b7b5-9b6040db75f0"). InnerVolumeSpecName "kube-api-access-rgm47". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:07:09 crc kubenswrapper[4685]: I0321 04:07:09.225139 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgm47\" (UniqueName: \"kubernetes.io/projected/e92e8d0a-5e30-4648-b7b5-9b6040db75f0-kube-api-access-rgm47\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:09 crc kubenswrapper[4685]: I0321 04:07:09.483292 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-tf44w"] Mar 21 04:07:09 crc kubenswrapper[4685]: I0321 04:07:09.483494 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-tf44w" podUID="5397a9a1-b670-42a8-8515-8cf15e8aa2d4" containerName="operator" containerID="cri-o://93ff44a4581786ab19bb098a9133e5ff804e505fdec4b90e75879f470a42909c" gracePeriod=10 Mar 21 04:07:09 crc kubenswrapper[4685]: I0321 04:07:09.811471 4685 util.go:48] "No ready sandbox for pod can be found. 
Mar 21 04:07:09 crc kubenswrapper[4685]: I0321 04:07:09.817289 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-vl4k5"]
Mar 21 04:07:09 crc kubenswrapper[4685]: I0321 04:07:09.817516 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-index-vl4k5" podUID="f07c1e30-7d41-456a-adb4-2d042b562bf7" containerName="registry-server" containerID="cri-o://99d4d708ebba1cf37ef65d04f6a9587ba4efa402ad9d1d22b7762f35a5e9e79e" gracePeriod=30
Mar 21 04:07:09 crc kubenswrapper[4685]: I0321 04:07:09.843869 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qzckr"]
Mar 21 04:07:09 crc kubenswrapper[4685]: I0321 04:07:09.846683 4685 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/rabbitmq-cluster-operator-index-vl4k5" podUID="f07c1e30-7d41-456a-adb4-2d042b562bf7" containerName="registry-server" probeResult="failure" output=""
Mar 21 04:07:09 crc kubenswrapper[4685]: I0321 04:07:09.852286 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590qzckr"]
Mar 21 04:07:09 crc kubenswrapper[4685]: I0321 04:07:09.945645 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr5j8\" (UniqueName: \"kubernetes.io/projected/5397a9a1-b670-42a8-8515-8cf15e8aa2d4-kube-api-access-rr5j8\") pod \"5397a9a1-b670-42a8-8515-8cf15e8aa2d4\" (UID: \"5397a9a1-b670-42a8-8515-8cf15e8aa2d4\") "
Mar 21 04:07:09 crc kubenswrapper[4685]: I0321 04:07:09.949065 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5397a9a1-b670-42a8-8515-8cf15e8aa2d4-kube-api-access-rr5j8" (OuterVolumeSpecName: "kube-api-access-rr5j8") pod "5397a9a1-b670-42a8-8515-8cf15e8aa2d4" (UID: "5397a9a1-b670-42a8-8515-8cf15e8aa2d4"). InnerVolumeSpecName "kube-api-access-rr5j8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:07:09 crc kubenswrapper[4685]: I0321 04:07:09.967640 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-h4hc9" event={"ID":"e92e8d0a-5e30-4648-b7b5-9b6040db75f0","Type":"ContainerDied","Data":"4fa08531c0227dea6527b5d8c58c0efa2feb05317ccc9cd694eb6d7c9918d176"}
Mar 21 04:07:09 crc kubenswrapper[4685]: I0321 04:07:09.967686 4685 scope.go:117] "RemoveContainer" containerID="bd6b7473b6bccdaa548f8ab30e82f3ea62e21f216d4fe508e8351b54c92bc0cb"
Mar 21 04:07:09 crc kubenswrapper[4685]: I0321 04:07:09.967790 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-h4hc9"
Mar 21 04:07:09 crc kubenswrapper[4685]: I0321 04:07:09.970875 4685 generic.go:334] "Generic (PLEG): container finished" podID="5397a9a1-b670-42a8-8515-8cf15e8aa2d4" containerID="93ff44a4581786ab19bb098a9133e5ff804e505fdec4b90e75879f470a42909c" exitCode=0
Mar 21 04:07:09 crc kubenswrapper[4685]: I0321 04:07:09.970911 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-tf44w"
Mar 21 04:07:09 crc kubenswrapper[4685]: I0321 04:07:09.970940 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-tf44w" event={"ID":"5397a9a1-b670-42a8-8515-8cf15e8aa2d4","Type":"ContainerDied","Data":"93ff44a4581786ab19bb098a9133e5ff804e505fdec4b90e75879f470a42909c"}
Mar 21 04:07:09 crc kubenswrapper[4685]: I0321 04:07:09.970962 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-tf44w" event={"ID":"5397a9a1-b670-42a8-8515-8cf15e8aa2d4","Type":"ContainerDied","Data":"e54c05fc1782ad427efdac278a923b7065f8e255ba178f5febcda6ad0aa3364b"}
Mar 21 04:07:09 crc kubenswrapper[4685]: I0321 04:07:09.972404 4685 generic.go:334] "Generic (PLEG): container finished" podID="f07c1e30-7d41-456a-adb4-2d042b562bf7" containerID="99d4d708ebba1cf37ef65d04f6a9587ba4efa402ad9d1d22b7762f35a5e9e79e" exitCode=0
Mar 21 04:07:09 crc kubenswrapper[4685]: I0321 04:07:09.972435 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-vl4k5" event={"ID":"f07c1e30-7d41-456a-adb4-2d042b562bf7","Type":"ContainerDied","Data":"99d4d708ebba1cf37ef65d04f6a9587ba4efa402ad9d1d22b7762f35a5e9e79e"}
Mar 21 04:07:09 crc kubenswrapper[4685]: I0321 04:07:09.987656 4685 scope.go:117] "RemoveContainer" containerID="93ff44a4581786ab19bb098a9133e5ff804e505fdec4b90e75879f470a42909c"
Mar 21 04:07:10 crc kubenswrapper[4685]: I0321 04:07:10.008273 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-tf44w"]
Mar 21 04:07:10 crc kubenswrapper[4685]: I0321 04:07:10.013766 4685 scope.go:117] "RemoveContainer" containerID="93ff44a4581786ab19bb098a9133e5ff804e505fdec4b90e75879f470a42909c"
Mar 21 04:07:10 crc kubenswrapper[4685]: I0321 04:07:10.014091 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-tf44w"]
Mar 21 04:07:10 crc kubenswrapper[4685]: E0321 04:07:10.014126 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93ff44a4581786ab19bb098a9133e5ff804e505fdec4b90e75879f470a42909c\": container with ID starting with 93ff44a4581786ab19bb098a9133e5ff804e505fdec4b90e75879f470a42909c not found: ID does not exist" containerID="93ff44a4581786ab19bb098a9133e5ff804e505fdec4b90e75879f470a42909c"
Mar 21 04:07:10 crc kubenswrapper[4685]: I0321 04:07:10.014159 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93ff44a4581786ab19bb098a9133e5ff804e505fdec4b90e75879f470a42909c"} err="failed to get container status \"93ff44a4581786ab19bb098a9133e5ff804e505fdec4b90e75879f470a42909c\": rpc error: code = NotFound desc = could not find container \"93ff44a4581786ab19bb098a9133e5ff804e505fdec4b90e75879f470a42909c\": container with ID starting with 93ff44a4581786ab19bb098a9133e5ff804e505fdec4b90e75879f470a42909c not found: ID does not exist"
Mar 21 04:07:10 crc kubenswrapper[4685]: I0321 04:07:10.018993 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-h4hc9"]
Mar 21 04:07:10 crc kubenswrapper[4685]: I0321 04:07:10.022396 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-index-h4hc9"]
Mar 21 04:07:10 crc kubenswrapper[4685]: I0321 04:07:10.048591 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr5j8\" (UniqueName: \"kubernetes.io/projected/5397a9a1-b670-42a8-8515-8cf15e8aa2d4-kube-api-access-rr5j8\") on node \"crc\" DevicePath \"\""
Mar 21 04:07:10 crc kubenswrapper[4685]: I0321 04:07:10.182943 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-vl4k5"
Mar 21 04:07:10 crc kubenswrapper[4685]: I0321 04:07:10.250145 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8r4n\" (UniqueName: \"kubernetes.io/projected/f07c1e30-7d41-456a-adb4-2d042b562bf7-kube-api-access-v8r4n\") pod \"f07c1e30-7d41-456a-adb4-2d042b562bf7\" (UID: \"f07c1e30-7d41-456a-adb4-2d042b562bf7\") "
Mar 21 04:07:10 crc kubenswrapper[4685]: I0321 04:07:10.252561 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f07c1e30-7d41-456a-adb4-2d042b562bf7-kube-api-access-v8r4n" (OuterVolumeSpecName: "kube-api-access-v8r4n") pod "f07c1e30-7d41-456a-adb4-2d042b562bf7" (UID: "f07c1e30-7d41-456a-adb4-2d042b562bf7"). InnerVolumeSpecName "kube-api-access-v8r4n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:07:10 crc kubenswrapper[4685]: I0321 04:07:10.312971 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="261fe7a7-2f57-4959-b23a-0752118908c9" path="/var/lib/kubelet/pods/261fe7a7-2f57-4959-b23a-0752118908c9/volumes"
Mar 21 04:07:10 crc kubenswrapper[4685]: I0321 04:07:10.313559 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5397a9a1-b670-42a8-8515-8cf15e8aa2d4" path="/var/lib/kubelet/pods/5397a9a1-b670-42a8-8515-8cf15e8aa2d4/volumes"
Mar 21 04:07:10 crc kubenswrapper[4685]: I0321 04:07:10.314002 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e33f8cc-f6fa-48ab-a172-74892478c268" path="/var/lib/kubelet/pods/7e33f8cc-f6fa-48ab-a172-74892478c268/volumes"
Mar 21 04:07:10 crc kubenswrapper[4685]: I0321 04:07:10.314859 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4b6e003-448d-437f-805e-5dd92c8ea2aa" path="/var/lib/kubelet/pods/b4b6e003-448d-437f-805e-5dd92c8ea2aa/volumes"
Mar 21 04:07:10 crc kubenswrapper[4685]: I0321 04:07:10.315361 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e92e8d0a-5e30-4648-b7b5-9b6040db75f0" path="/var/lib/kubelet/pods/e92e8d0a-5e30-4648-b7b5-9b6040db75f0/volumes"
Mar 21 04:07:10 crc kubenswrapper[4685]: I0321 04:07:10.351771 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8r4n\" (UniqueName: \"kubernetes.io/projected/f07c1e30-7d41-456a-adb4-2d042b562bf7-kube-api-access-v8r4n\") on node \"crc\" DevicePath \"\""
Mar 21 04:07:10 crc kubenswrapper[4685]: I0321 04:07:10.984282 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-vl4k5" event={"ID":"f07c1e30-7d41-456a-adb4-2d042b562bf7","Type":"ContainerDied","Data":"ea86bedfd21c076e329bdb65d6a975f371031d383f9a853c61d4426eedd8a1f2"}
Mar 21 04:07:10 crc kubenswrapper[4685]: I0321 04:07:10.984323 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-vl4k5"
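kubelet_volumes.go logs "Cleaned up orphaned pod volumes dir" once per deleted pod UID, and each such cleanup should be preceded by "Volume detached" records for that UID. A sketch of that cross-check against a saved journal (same assumed filename); over a truncated window like this one, pods whose volumes detached before the capture started will surface as false positives, so the result is only meaningful on a full capture.

    import re

    detached, cleaned = set(), set()
    # Pod UID embedded in a volume UniqueName, e.g. kubernetes.io/projected/<uid>-...
    uid_in_unique = re.compile(
        r'(?:projected|configmap|secret|empty-dir|local-volume)/'
        r'([0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12})-')
    cleaned_re = re.compile(
        r'Cleaned up orphaned pod volumes dir.*?podUID="([0-9a-f-]{36})"')

    with open("kubelet.log", encoding="utf-8") as fh:  # assumed filename
        for line in fh:
            if "Volume detached" in line and (m := uid_in_unique.search(line)):
                detached.add(m.group(1))
            if (m := cleaned_re.search(line)):
                cleaned.add(m.group(1))

    print("cleaned without a detach record in this window:", cleaned - detached)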
Mar 21 04:07:10 crc kubenswrapper[4685]: I0321 04:07:10.984339 4685 scope.go:117] "RemoveContainer" containerID="99d4d708ebba1cf37ef65d04f6a9587ba4efa402ad9d1d22b7762f35a5e9e79e"
Mar 21 04:07:11 crc kubenswrapper[4685]: I0321 04:07:11.010152 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-vl4k5"]
Mar 21 04:07:11 crc kubenswrapper[4685]: I0321 04:07:11.014741 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-vl4k5"]
Mar 21 04:07:12 crc kubenswrapper[4685]: I0321 04:07:12.308517 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f07c1e30-7d41-456a-adb4-2d042b562bf7" path="/var/lib/kubelet/pods/f07c1e30-7d41-456a-adb4-2d042b562bf7/volumes"
Mar 21 04:07:12 crc kubenswrapper[4685]: E0321 04:07:12.691012 4685 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e33f8cc_f6fa_48ab_a172_74892478c268.slice/crio-e2d92cc4f8795aed53ebcd983b89eb996fa6974a422c65d6956148de1c17f3c7.scope\": RecentStats: unable to find data in memory cache]"
Mar 21 04:07:15 crc kubenswrapper[4685]: I0321 04:07:15.397544 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-6c46fc7bc5-hpt7d"]
Mar 21 04:07:15 crc kubenswrapper[4685]: I0321 04:07:15.398282 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-controller-manager-6c46fc7bc5-hpt7d" podUID="7a568963-c78a-41ee-ab5a-25f5d1eb0bb5" containerName="manager" containerID="cri-o://e9f6faedafec22c9e6bfb53ba5c487a41d1648f4b7bf69febdc4bddbd41264a4" gracePeriod=10
Mar 21 04:07:15 crc kubenswrapper[4685]: I0321 04:07:15.744007 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-2d98g"]
Mar 21 04:07:15 crc kubenswrapper[4685]: I0321 04:07:15.744725 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-2d98g" podUID="2599b123-88b7-41bb-981a-ce52020584c9" containerName="registry-server" containerID="cri-o://cbfb74167f21b03e55f48da2e64caaef2a70c86f9394cda21d8a5d2101b088dc" gracePeriod=30
Mar 21 04:07:15 crc kubenswrapper[4685]: E0321 04:07:15.765925 4685 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cbfb74167f21b03e55f48da2e64caaef2a70c86f9394cda21d8a5d2101b088dc is running failed: container process not found" containerID="cbfb74167f21b03e55f48da2e64caaef2a70c86f9394cda21d8a5d2101b088dc" cmd=["grpc_health_probe","-addr=:50051"]
Mar 21 04:07:15 crc kubenswrapper[4685]: E0321 04:07:15.766486 4685 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cbfb74167f21b03e55f48da2e64caaef2a70c86f9394cda21d8a5d2101b088dc is running failed: container process not found" containerID="cbfb74167f21b03e55f48da2e64caaef2a70c86f9394cda21d8a5d2101b088dc" cmd=["grpc_health_probe","-addr=:50051"]
Mar 21 04:07:15 crc kubenswrapper[4685]: E0321 04:07:15.766862 4685 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cbfb74167f21b03e55f48da2e64caaef2a70c86f9394cda21d8a5d2101b088dc is running failed: container process not found" containerID="cbfb74167f21b03e55f48da2e64caaef2a70c86f9394cda21d8a5d2101b088dc" cmd=["grpc_health_probe","-addr=:50051"]
Mar 21 04:07:15 crc kubenswrapper[4685]: E0321 04:07:15.766969 4685 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cbfb74167f21b03e55f48da2e64caaef2a70c86f9394cda21d8a5d2101b088dc is running failed: container process not found" probeType="Readiness" pod="openstack-operators/infra-operator-index-2d98g" podUID="2599b123-88b7-41bb-981a-ce52020584c9" containerName="registry-server"
Mar 21 04:07:15 crc kubenswrapper[4685]: I0321 04:07:15.799875 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/014ac6de79e0394b0d824e62fe3d0678564789565fd857972d41b2afd4f9qmw"]
Mar 21 04:07:15 crc kubenswrapper[4685]: I0321 04:07:15.819448 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/014ac6de79e0394b0d824e62fe3d0678564789565fd857972d41b2afd4f9qmw"]
Mar 21 04:07:15 crc kubenswrapper[4685]: I0321 04:07:15.896952 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-6c46fc7bc5-hpt7d"
Mar 21 04:07:16 crc kubenswrapper[4685]: I0321 04:07:16.023277 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7a568963-c78a-41ee-ab5a-25f5d1eb0bb5-apiservice-cert\") pod \"7a568963-c78a-41ee-ab5a-25f5d1eb0bb5\" (UID: \"7a568963-c78a-41ee-ab5a-25f5d1eb0bb5\") "
Mar 21 04:07:16 crc kubenswrapper[4685]: I0321 04:07:16.023318 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7a568963-c78a-41ee-ab5a-25f5d1eb0bb5-webhook-cert\") pod \"7a568963-c78a-41ee-ab5a-25f5d1eb0bb5\" (UID: \"7a568963-c78a-41ee-ab5a-25f5d1eb0bb5\") "
Mar 21 04:07:16 crc kubenswrapper[4685]: I0321 04:07:16.023378 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9hrt\" (UniqueName: \"kubernetes.io/projected/7a568963-c78a-41ee-ab5a-25f5d1eb0bb5-kube-api-access-x9hrt\") pod \"7a568963-c78a-41ee-ab5a-25f5d1eb0bb5\" (UID: \"7a568963-c78a-41ee-ab5a-25f5d1eb0bb5\") "
Mar 21 04:07:16 crc kubenswrapper[4685]: I0321 04:07:16.034114 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a568963-c78a-41ee-ab5a-25f5d1eb0bb5-kube-api-access-x9hrt" (OuterVolumeSpecName: "kube-api-access-x9hrt") pod "7a568963-c78a-41ee-ab5a-25f5d1eb0bb5" (UID: "7a568963-c78a-41ee-ab5a-25f5d1eb0bb5"). InnerVolumeSpecName "kube-api-access-x9hrt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:07:16 crc kubenswrapper[4685]: I0321 04:07:16.034214 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a568963-c78a-41ee-ab5a-25f5d1eb0bb5-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "7a568963-c78a-41ee-ab5a-25f5d1eb0bb5" (UID: "7a568963-c78a-41ee-ab5a-25f5d1eb0bb5"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
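The ExecSync and probe failures above all fire after "Killing container with a grace period" was issued for the same pod: grpc_health_probe is being exec'd into a container whose process is already gone, so this is expected teardown noise rather than a health regression. A sketch, under the same saved-journal assumption, that tallies probe errors per pod and marks the ones covered by a kill entry:

    import re
    from collections import Counter

    killing, errors = set(), Counter()
    pod_re = re.compile(r'pod="([\w/-]+)"')

    with open("kubelet.log", encoding="utf-8") as fh:  # assumed filename
        for line in fh:
            m = pod_re.search(line)
            if not m:
                continue
            pod = m.group(1)
            if "Killing container with a grace period" in line:
                killing.add(pod)
            elif "Probe errored" in line or "Probe failed" in line:
                errors[pod] += 1

    for pod, n in errors.items():
        tag = "expected (pod being killed)" if pod in killing else "investigate"
        print(f"{pod}: {n} probe error(s) -> {tag}")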
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:07:16 crc kubenswrapper[4685]: I0321 04:07:16.038450 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a568963-c78a-41ee-ab5a-25f5d1eb0bb5-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "7a568963-c78a-41ee-ab5a-25f5d1eb0bb5" (UID: "7a568963-c78a-41ee-ab5a-25f5d1eb0bb5"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:07:16 crc kubenswrapper[4685]: I0321 04:07:16.041336 4685 generic.go:334] "Generic (PLEG): container finished" podID="2599b123-88b7-41bb-981a-ce52020584c9" containerID="cbfb74167f21b03e55f48da2e64caaef2a70c86f9394cda21d8a5d2101b088dc" exitCode=0 Mar 21 04:07:16 crc kubenswrapper[4685]: I0321 04:07:16.041458 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-2d98g" event={"ID":"2599b123-88b7-41bb-981a-ce52020584c9","Type":"ContainerDied","Data":"cbfb74167f21b03e55f48da2e64caaef2a70c86f9394cda21d8a5d2101b088dc"} Mar 21 04:07:16 crc kubenswrapper[4685]: I0321 04:07:16.044380 4685 generic.go:334] "Generic (PLEG): container finished" podID="7a568963-c78a-41ee-ab5a-25f5d1eb0bb5" containerID="e9f6faedafec22c9e6bfb53ba5c487a41d1648f4b7bf69febdc4bddbd41264a4" exitCode=0 Mar 21 04:07:16 crc kubenswrapper[4685]: I0321 04:07:16.044413 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-6c46fc7bc5-hpt7d" event={"ID":"7a568963-c78a-41ee-ab5a-25f5d1eb0bb5","Type":"ContainerDied","Data":"e9f6faedafec22c9e6bfb53ba5c487a41d1648f4b7bf69febdc4bddbd41264a4"} Mar 21 04:07:16 crc kubenswrapper[4685]: I0321 04:07:16.044437 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-6c46fc7bc5-hpt7d" event={"ID":"7a568963-c78a-41ee-ab5a-25f5d1eb0bb5","Type":"ContainerDied","Data":"b0755a995f7eda0c84ac26eccb0bcda2b2c98831ecc1e853b3b0e9f8ae884b66"} Mar 21 04:07:16 crc kubenswrapper[4685]: I0321 04:07:16.044455 4685 scope.go:117] "RemoveContainer" containerID="e9f6faedafec22c9e6bfb53ba5c487a41d1648f4b7bf69febdc4bddbd41264a4" Mar 21 04:07:16 crc kubenswrapper[4685]: I0321 04:07:16.044456 4685 util.go:48] "No ready sandbox for pod can be found. 
Mar 21 04:07:16 crc kubenswrapper[4685]: I0321 04:07:16.065155 4685 scope.go:117] "RemoveContainer" containerID="e9f6faedafec22c9e6bfb53ba5c487a41d1648f4b7bf69febdc4bddbd41264a4"
Mar 21 04:07:16 crc kubenswrapper[4685]: E0321 04:07:16.066130 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9f6faedafec22c9e6bfb53ba5c487a41d1648f4b7bf69febdc4bddbd41264a4\": container with ID starting with e9f6faedafec22c9e6bfb53ba5c487a41d1648f4b7bf69febdc4bddbd41264a4 not found: ID does not exist" containerID="e9f6faedafec22c9e6bfb53ba5c487a41d1648f4b7bf69febdc4bddbd41264a4"
Mar 21 04:07:16 crc kubenswrapper[4685]: I0321 04:07:16.066228 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9f6faedafec22c9e6bfb53ba5c487a41d1648f4b7bf69febdc4bddbd41264a4"} err="failed to get container status \"e9f6faedafec22c9e6bfb53ba5c487a41d1648f4b7bf69febdc4bddbd41264a4\": rpc error: code = NotFound desc = could not find container \"e9f6faedafec22c9e6bfb53ba5c487a41d1648f4b7bf69febdc4bddbd41264a4\": container with ID starting with e9f6faedafec22c9e6bfb53ba5c487a41d1648f4b7bf69febdc4bddbd41264a4 not found: ID does not exist"
Mar 21 04:07:16 crc kubenswrapper[4685]: I0321 04:07:16.077966 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-6c46fc7bc5-hpt7d"]
Mar 21 04:07:16 crc kubenswrapper[4685]: I0321 04:07:16.086599 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-controller-manager-6c46fc7bc5-hpt7d"]
Mar 21 04:07:16 crc kubenswrapper[4685]: I0321 04:07:16.086997 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-2d98g"
Mar 21 04:07:16 crc kubenswrapper[4685]: I0321 04:07:16.132890 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9hrt\" (UniqueName: \"kubernetes.io/projected/7a568963-c78a-41ee-ab5a-25f5d1eb0bb5-kube-api-access-x9hrt\") on node \"crc\" DevicePath \"\""
Mar 21 04:07:16 crc kubenswrapper[4685]: I0321 04:07:16.132929 4685 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7a568963-c78a-41ee-ab5a-25f5d1eb0bb5-apiservice-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:07:16 crc kubenswrapper[4685]: I0321 04:07:16.132942 4685 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7a568963-c78a-41ee-ab5a-25f5d1eb0bb5-webhook-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:07:16 crc kubenswrapper[4685]: I0321 04:07:16.233483 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5hv7\" (UniqueName: \"kubernetes.io/projected/2599b123-88b7-41bb-981a-ce52020584c9-kube-api-access-k5hv7\") pod \"2599b123-88b7-41bb-981a-ce52020584c9\" (UID: \"2599b123-88b7-41bb-981a-ce52020584c9\") "
Mar 21 04:07:16 crc kubenswrapper[4685]: I0321 04:07:16.236658 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2599b123-88b7-41bb-981a-ce52020584c9-kube-api-access-k5hv7" (OuterVolumeSpecName: "kube-api-access-k5hv7") pod "2599b123-88b7-41bb-981a-ce52020584c9" (UID: "2599b123-88b7-41bb-981a-ce52020584c9"). InnerVolumeSpecName "kube-api-access-k5hv7". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:07:16 crc kubenswrapper[4685]: I0321 04:07:16.308107 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a568963-c78a-41ee-ab5a-25f5d1eb0bb5" path="/var/lib/kubelet/pods/7a568963-c78a-41ee-ab5a-25f5d1eb0bb5/volumes" Mar 21 04:07:16 crc kubenswrapper[4685]: I0321 04:07:16.309171 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6f4b7f3-a938-4717-a6aa-cc9b620738b0" path="/var/lib/kubelet/pods/e6f4b7f3-a938-4717-a6aa-cc9b620738b0/volumes" Mar 21 04:07:16 crc kubenswrapper[4685]: I0321 04:07:16.335269 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5hv7\" (UniqueName: \"kubernetes.io/projected/2599b123-88b7-41bb-981a-ce52020584c9-kube-api-access-k5hv7\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:16 crc kubenswrapper[4685]: I0321 04:07:16.810317 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6697764dc7-vzzbc"] Mar 21 04:07:16 crc kubenswrapper[4685]: I0321 04:07:16.810574 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-controller-manager-6697764dc7-vzzbc" podUID="3be15e47-4dd7-4e37-81ac-a3e7ec139af1" containerName="manager" containerID="cri-o://b5c9b4de08deb332e3fc18bfe577959d3b1748a282928fd0e7147671831d4c01" gracePeriod=10 Mar 21 04:07:17 crc kubenswrapper[4685]: I0321 04:07:17.038961 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-v8f9z"] Mar 21 04:07:17 crc kubenswrapper[4685]: I0321 04:07:17.039505 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-v8f9z" podUID="0a32f016-2085-4f24-83d7-68a6a01d1f02" containerName="registry-server" containerID="cri-o://9c4fa5b79b74d881d929682dcc86b2b780b6847ad69b900b5a0deabf5967e119" gracePeriod=30 Mar 21 04:07:17 crc kubenswrapper[4685]: I0321 04:07:17.051562 4685 generic.go:334] "Generic (PLEG): container finished" podID="3be15e47-4dd7-4e37-81ac-a3e7ec139af1" containerID="b5c9b4de08deb332e3fc18bfe577959d3b1748a282928fd0e7147671831d4c01" exitCode=0 Mar 21 04:07:17 crc kubenswrapper[4685]: I0321 04:07:17.051610 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6697764dc7-vzzbc" event={"ID":"3be15e47-4dd7-4e37-81ac-a3e7ec139af1","Type":"ContainerDied","Data":"b5c9b4de08deb332e3fc18bfe577959d3b1748a282928fd0e7147671831d4c01"} Mar 21 04:07:17 crc kubenswrapper[4685]: I0321 04:07:17.055821 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-2d98g" event={"ID":"2599b123-88b7-41bb-981a-ce52020584c9","Type":"ContainerDied","Data":"a54a1f8b028cfd26b8ffd3d9416b4bcdd7fbca4ee11acd3f62d66c953e6970ab"} Mar 21 04:07:17 crc kubenswrapper[4685]: I0321 04:07:17.055828 4685 util.go:48] "No ready sandbox for pod can be found. 
Mar 21 04:07:17 crc kubenswrapper[4685]: I0321 04:07:17.055886 4685 scope.go:117] "RemoveContainer" containerID="cbfb74167f21b03e55f48da2e64caaef2a70c86f9394cda21d8a5d2101b088dc"
Mar 21 04:07:17 crc kubenswrapper[4685]: I0321 04:07:17.075596 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/7e3b51b6a8afd7c782c20eee88f3ff29a8534df38096cf089dfa437de6m8ksc"]
Mar 21 04:07:17 crc kubenswrapper[4685]: I0321 04:07:17.094905 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/7e3b51b6a8afd7c782c20eee88f3ff29a8534df38096cf089dfa437de6m8ksc"]
Mar 21 04:07:17 crc kubenswrapper[4685]: I0321 04:07:17.102876 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-2d98g"]
Mar 21 04:07:17 crc kubenswrapper[4685]: I0321 04:07:17.111068 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-2d98g"]
Mar 21 04:07:17 crc kubenswrapper[4685]: I0321 04:07:17.299723 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6697764dc7-vzzbc"
Mar 21 04:07:17 crc kubenswrapper[4685]: I0321 04:07:17.355997 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-v8f9z"
Mar 21 04:07:17 crc kubenswrapper[4685]: I0321 04:07:17.446237 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjm5c\" (UniqueName: \"kubernetes.io/projected/0a32f016-2085-4f24-83d7-68a6a01d1f02-kube-api-access-qjm5c\") pod \"0a32f016-2085-4f24-83d7-68a6a01d1f02\" (UID: \"0a32f016-2085-4f24-83d7-68a6a01d1f02\") "
Mar 21 04:07:17 crc kubenswrapper[4685]: I0321 04:07:17.446366 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzfsx\" (UniqueName: \"kubernetes.io/projected/3be15e47-4dd7-4e37-81ac-a3e7ec139af1-kube-api-access-dzfsx\") pod \"3be15e47-4dd7-4e37-81ac-a3e7ec139af1\" (UID: \"3be15e47-4dd7-4e37-81ac-a3e7ec139af1\") "
Mar 21 04:07:17 crc kubenswrapper[4685]: I0321 04:07:17.446415 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3be15e47-4dd7-4e37-81ac-a3e7ec139af1-webhook-cert\") pod \"3be15e47-4dd7-4e37-81ac-a3e7ec139af1\" (UID: \"3be15e47-4dd7-4e37-81ac-a3e7ec139af1\") "
Mar 21 04:07:17 crc kubenswrapper[4685]: I0321 04:07:17.446459 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3be15e47-4dd7-4e37-81ac-a3e7ec139af1-apiservice-cert\") pod \"3be15e47-4dd7-4e37-81ac-a3e7ec139af1\" (UID: \"3be15e47-4dd7-4e37-81ac-a3e7ec139af1\") "
Mar 21 04:07:17 crc kubenswrapper[4685]: I0321 04:07:17.451627 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a32f016-2085-4f24-83d7-68a6a01d1f02-kube-api-access-qjm5c" (OuterVolumeSpecName: "kube-api-access-qjm5c") pod "0a32f016-2085-4f24-83d7-68a6a01d1f02" (UID: "0a32f016-2085-4f24-83d7-68a6a01d1f02"). InnerVolumeSpecName "kube-api-access-qjm5c". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:07:17 crc kubenswrapper[4685]: I0321 04:07:17.451711 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3be15e47-4dd7-4e37-81ac-a3e7ec139af1-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "3be15e47-4dd7-4e37-81ac-a3e7ec139af1" (UID: "3be15e47-4dd7-4e37-81ac-a3e7ec139af1"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:07:17 crc kubenswrapper[4685]: I0321 04:07:17.451784 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3be15e47-4dd7-4e37-81ac-a3e7ec139af1-kube-api-access-dzfsx" (OuterVolumeSpecName: "kube-api-access-dzfsx") pod "3be15e47-4dd7-4e37-81ac-a3e7ec139af1" (UID: "3be15e47-4dd7-4e37-81ac-a3e7ec139af1"). InnerVolumeSpecName "kube-api-access-dzfsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:07:17 crc kubenswrapper[4685]: I0321 04:07:17.451951 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3be15e47-4dd7-4e37-81ac-a3e7ec139af1-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "3be15e47-4dd7-4e37-81ac-a3e7ec139af1" (UID: "3be15e47-4dd7-4e37-81ac-a3e7ec139af1"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:07:17 crc kubenswrapper[4685]: I0321 04:07:17.548385 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzfsx\" (UniqueName: \"kubernetes.io/projected/3be15e47-4dd7-4e37-81ac-a3e7ec139af1-kube-api-access-dzfsx\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:17 crc kubenswrapper[4685]: I0321 04:07:17.548431 4685 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3be15e47-4dd7-4e37-81ac-a3e7ec139af1-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:17 crc kubenswrapper[4685]: I0321 04:07:17.548446 4685 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3be15e47-4dd7-4e37-81ac-a3e7ec139af1-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:17 crc kubenswrapper[4685]: I0321 04:07:17.548457 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjm5c\" (UniqueName: \"kubernetes.io/projected/0a32f016-2085-4f24-83d7-68a6a01d1f02-kube-api-access-qjm5c\") on node \"crc\" DevicePath \"\"" Mar 21 04:07:18 crc kubenswrapper[4685]: I0321 04:07:18.064398 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6697764dc7-vzzbc" event={"ID":"3be15e47-4dd7-4e37-81ac-a3e7ec139af1","Type":"ContainerDied","Data":"e68f673c02549104aea351f777a21b53b632feb87fbd529fac341c6f4798ec62"} Mar 21 04:07:18 crc kubenswrapper[4685]: I0321 04:07:18.064420 4685 util.go:48] "No ready sandbox for pod can be found. 
Mar 21 04:07:18 crc kubenswrapper[4685]: I0321 04:07:18.064462 4685 scope.go:117] "RemoveContainer" containerID="b5c9b4de08deb332e3fc18bfe577959d3b1748a282928fd0e7147671831d4c01"
Mar 21 04:07:18 crc kubenswrapper[4685]: I0321 04:07:18.065804 4685 generic.go:334] "Generic (PLEG): container finished" podID="0a32f016-2085-4f24-83d7-68a6a01d1f02" containerID="9c4fa5b79b74d881d929682dcc86b2b780b6847ad69b900b5a0deabf5967e119" exitCode=0
Mar 21 04:07:18 crc kubenswrapper[4685]: I0321 04:07:18.065831 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-v8f9z" event={"ID":"0a32f016-2085-4f24-83d7-68a6a01d1f02","Type":"ContainerDied","Data":"9c4fa5b79b74d881d929682dcc86b2b780b6847ad69b900b5a0deabf5967e119"}
Mar 21 04:07:18 crc kubenswrapper[4685]: I0321 04:07:18.065858 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-v8f9z" event={"ID":"0a32f016-2085-4f24-83d7-68a6a01d1f02","Type":"ContainerDied","Data":"12cda35357d8475a4c1eacac304a470236d20996b00d97bc5086a2e4ac8ec7f7"}
Mar 21 04:07:18 crc kubenswrapper[4685]: I0321 04:07:18.065899 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-v8f9z"
Mar 21 04:07:18 crc kubenswrapper[4685]: I0321 04:07:18.084394 4685 scope.go:117] "RemoveContainer" containerID="9c4fa5b79b74d881d929682dcc86b2b780b6847ad69b900b5a0deabf5967e119"
Mar 21 04:07:18 crc kubenswrapper[4685]: I0321 04:07:18.091219 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-v8f9z"]
Mar 21 04:07:18 crc kubenswrapper[4685]: I0321 04:07:18.095401 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-v8f9z"]
Mar 21 04:07:18 crc kubenswrapper[4685]: I0321 04:07:18.106314 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6697764dc7-vzzbc"]
Mar 21 04:07:18 crc kubenswrapper[4685]: I0321 04:07:18.107063 4685 scope.go:117] "RemoveContainer" containerID="9c4fa5b79b74d881d929682dcc86b2b780b6847ad69b900b5a0deabf5967e119"
Mar 21 04:07:18 crc kubenswrapper[4685]: E0321 04:07:18.107493 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c4fa5b79b74d881d929682dcc86b2b780b6847ad69b900b5a0deabf5967e119\": container with ID starting with 9c4fa5b79b74d881d929682dcc86b2b780b6847ad69b900b5a0deabf5967e119 not found: ID does not exist" containerID="9c4fa5b79b74d881d929682dcc86b2b780b6847ad69b900b5a0deabf5967e119"
Mar 21 04:07:18 crc kubenswrapper[4685]: I0321 04:07:18.107520 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c4fa5b79b74d881d929682dcc86b2b780b6847ad69b900b5a0deabf5967e119"} err="failed to get container status \"9c4fa5b79b74d881d929682dcc86b2b780b6847ad69b900b5a0deabf5967e119\": rpc error: code = NotFound desc = could not find container \"9c4fa5b79b74d881d929682dcc86b2b780b6847ad69b900b5a0deabf5967e119\": container with ID starting with 9c4fa5b79b74d881d929682dcc86b2b780b6847ad69b900b5a0deabf5967e119 not found: ID does not exist"
Mar 21 04:07:18 crc kubenswrapper[4685]: I0321 04:07:18.110983 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6697764dc7-vzzbc"]
Mar 21 04:07:18 crc kubenswrapper[4685]: I0321 04:07:18.306300 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a32f016-2085-4f24-83d7-68a6a01d1f02" path="/var/lib/kubelet/pods/0a32f016-2085-4f24-83d7-68a6a01d1f02/volumes"
Mar 21 04:07:18 crc kubenswrapper[4685]: I0321 04:07:18.306910 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1576ba2d-61cd-4dde-b4ce-eab38b64a3f2" path="/var/lib/kubelet/pods/1576ba2d-61cd-4dde-b4ce-eab38b64a3f2/volumes"
Mar 21 04:07:18 crc kubenswrapper[4685]: I0321 04:07:18.307585 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2599b123-88b7-41bb-981a-ce52020584c9" path="/var/lib/kubelet/pods/2599b123-88b7-41bb-981a-ce52020584c9/volumes"
Mar 21 04:07:18 crc kubenswrapper[4685]: I0321 04:07:18.308091 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3be15e47-4dd7-4e37-81ac-a3e7ec139af1" path="/var/lib/kubelet/pods/3be15e47-4dd7-4e37-81ac-a3e7ec139af1/volumes"
Mar 21 04:07:22 crc kubenswrapper[4685]: E0321 04:07:22.838477 4685 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e33f8cc_f6fa_48ab_a172_74892478c268.slice/crio-e2d92cc4f8795aed53ebcd983b89eb996fa6974a422c65d6956148de1c17f3c7.scope\": RecentStats: unable to find data in memory cache]"
Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.944018 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-clkpq/must-gather-mfmmh"]
Mar 21 04:07:28 crc kubenswrapper[4685]: E0321 04:07:28.944644 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a69263a8-bd4d-476c-99fc-f1202f36f8a0" containerName="galera"
Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.944660 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="a69263a8-bd4d-476c-99fc-f1202f36f8a0" containerName="galera"
Mar 21 04:07:28 crc kubenswrapper[4685]: E0321 04:07:28.944675 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a69263a8-bd4d-476c-99fc-f1202f36f8a0" containerName="mysql-bootstrap"
Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.944682 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="a69263a8-bd4d-476c-99fc-f1202f36f8a0" containerName="mysql-bootstrap"
Mar 21 04:07:28 crc kubenswrapper[4685]: E0321 04:07:28.944695 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50509c19-c2fa-4171-a5f8-e4d699a9062c" containerName="manager"
Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.944703 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="50509c19-c2fa-4171-a5f8-e4d699a9062c" containerName="manager"
Mar 21 04:07:28 crc kubenswrapper[4685]: E0321 04:07:28.944716 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b2b4768-f546-4fad-9609-8b01fa7749dc" containerName="keystone-api"
Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.944724 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b2b4768-f546-4fad-9609-8b01fa7749dc" containerName="keystone-api"
Mar 21 04:07:28 crc kubenswrapper[4685]: E0321 04:07:28.944734 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f046207-c975-4417-89f1-650002978bca" containerName="mariadb-account-delete"
Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.944741 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f046207-c975-4417-89f1-650002978bca" containerName="mariadb-account-delete"
Mar 21 04:07:28 crc kubenswrapper[4685]: E0321 04:07:28.944753 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08b81240-20f5-499a-afd2-5666d0fa97e3" containerName="registry-server"
Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.944761 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="08b81240-20f5-499a-afd2-5666d0fa97e3" containerName="registry-server"
Mar 21 04:07:28 crc kubenswrapper[4685]: E0321 04:07:28.944772 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91653387-c2f1-4240-b710-e0c709eb769d" containerName="memcached"
Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.944778 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="91653387-c2f1-4240-b710-e0c709eb769d" containerName="memcached"
Mar 21 04:07:28 crc kubenswrapper[4685]: E0321 04:07:28.944789 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f07c1e30-7d41-456a-adb4-2d042b562bf7" containerName="registry-server"
Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.944795 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="f07c1e30-7d41-456a-adb4-2d042b562bf7" containerName="registry-server"
Mar 21 04:07:28 crc kubenswrapper[4685]: E0321 04:07:28.944806 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba532d6b-607c-450f-adb7-8d4e14ff58e0" containerName="mysql-bootstrap"
Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.944812 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba532d6b-607c-450f-adb7-8d4e14ff58e0" containerName="mysql-bootstrap"
Mar 21 04:07:28 crc kubenswrapper[4685]: E0321 04:07:28.944821 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5397a9a1-b670-42a8-8515-8cf15e8aa2d4" containerName="operator"
Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.944827 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="5397a9a1-b670-42a8-8515-8cf15e8aa2d4" containerName="operator"
Mar 21 04:07:28 crc kubenswrapper[4685]: E0321 04:07:28.944854 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a32f016-2085-4f24-83d7-68a6a01d1f02" containerName="registry-server"
Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.944861 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a32f016-2085-4f24-83d7-68a6a01d1f02" containerName="registry-server"
Mar 21 04:07:28 crc kubenswrapper[4685]: E0321 04:07:28.944872 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3be15e47-4dd7-4e37-81ac-a3e7ec139af1" containerName="manager"
Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.944878 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="3be15e47-4dd7-4e37-81ac-a3e7ec139af1" containerName="manager"
Mar 21 04:07:28 crc kubenswrapper[4685]: E0321 04:07:28.944889 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2599b123-88b7-41bb-981a-ce52020584c9" containerName="registry-server"
Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.944898 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="2599b123-88b7-41bb-981a-ce52020584c9" containerName="registry-server"
Mar 21 04:07:28 crc kubenswrapper[4685]: E0321 04:07:28.944905 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ee610c4-8416-4d1c-a6b4-2324f1541b1c" containerName="mysql-bootstrap"
Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.944912 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ee610c4-8416-4d1c-a6b4-2324f1541b1c" containerName="mysql-bootstrap"
Mar 21 04:07:28 crc kubenswrapper[4685]: E0321 04:07:28.944922 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba532d6b-607c-450f-adb7-8d4e14ff58e0" containerName="galera"
Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.944929 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba532d6b-607c-450f-adb7-8d4e14ff58e0" containerName="galera"
Mar 21 04:07:28 crc kubenswrapper[4685]: E0321 04:07:28.944939 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d18693b0-ea2a-4795-a4de-15a379cc8490" containerName="rabbitmq"
Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.944946 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="d18693b0-ea2a-4795-a4de-15a379cc8490" containerName="rabbitmq"
Mar 21 04:07:28 crc kubenswrapper[4685]: E0321 04:07:28.944954 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e33f8cc-f6fa-48ab-a172-74892478c268" containerName="manager"
Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.944960 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e33f8cc-f6fa-48ab-a172-74892478c268" containerName="manager"
Mar 21 04:07:28 crc kubenswrapper[4685]: E0321 04:07:28.944972 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d18693b0-ea2a-4795-a4de-15a379cc8490" containerName="setup-container"
Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.944979 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="d18693b0-ea2a-4795-a4de-15a379cc8490" containerName="setup-container"
Mar 21 04:07:28 crc kubenswrapper[4685]: E0321 04:07:28.944988 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a568963-c78a-41ee-ab5a-25f5d1eb0bb5" containerName="manager"
Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.944995 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a568963-c78a-41ee-ab5a-25f5d1eb0bb5" containerName="manager"
Mar 21 04:07:28 crc kubenswrapper[4685]: E0321 04:07:28.945004 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ee610c4-8416-4d1c-a6b4-2324f1541b1c" containerName="galera"
Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.945010 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ee610c4-8416-4d1c-a6b4-2324f1541b1c" containerName="galera"
Mar 21 04:07:28 crc kubenswrapper[4685]: E0321 04:07:28.945019 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e92e8d0a-5e30-4648-b7b5-9b6040db75f0" containerName="registry-server"
Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.945025 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="e92e8d0a-5e30-4648-b7b5-9b6040db75f0" containerName="registry-server"
Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.945130 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="50509c19-c2fa-4171-a5f8-e4d699a9062c" containerName="manager"
Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.945140 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b2b4768-f546-4fad-9609-8b01fa7749dc" containerName="keystone-api"
Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.945150 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="3be15e47-4dd7-4e37-81ac-a3e7ec139af1" containerName="manager"
Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.945158 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba532d6b-607c-450f-adb7-8d4e14ff58e0" containerName="galera"
Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.945168 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="08b81240-20f5-499a-afd2-5666d0fa97e3" containerName="registry-server"
removing state" podUID="08b81240-20f5-499a-afd2-5666d0fa97e3" containerName="registry-server" Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.945176 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f046207-c975-4417-89f1-650002978bca" containerName="mariadb-account-delete" Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.945186 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="d18693b0-ea2a-4795-a4de-15a379cc8490" containerName="rabbitmq" Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.945194 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="91653387-c2f1-4240-b710-e0c709eb769d" containerName="memcached" Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.945204 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="a69263a8-bd4d-476c-99fc-f1202f36f8a0" containerName="galera" Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.945212 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e33f8cc-f6fa-48ab-a172-74892478c268" containerName="manager" Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.945221 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ee610c4-8416-4d1c-a6b4-2324f1541b1c" containerName="galera" Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.945229 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="5397a9a1-b670-42a8-8515-8cf15e8aa2d4" containerName="operator" Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.945240 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="e92e8d0a-5e30-4648-b7b5-9b6040db75f0" containerName="registry-server" Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.945249 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a32f016-2085-4f24-83d7-68a6a01d1f02" containerName="registry-server" Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.945261 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a568963-c78a-41ee-ab5a-25f5d1eb0bb5" containerName="manager" Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.945270 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="2599b123-88b7-41bb-981a-ce52020584c9" containerName="registry-server" Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.945277 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="f07c1e30-7d41-456a-adb4-2d042b562bf7" containerName="registry-server" Mar 21 04:07:28 crc kubenswrapper[4685]: E0321 04:07:28.945397 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f046207-c975-4417-89f1-650002978bca" containerName="mariadb-account-delete" Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.945406 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f046207-c975-4417-89f1-650002978bca" containerName="mariadb-account-delete" Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.945510 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f046207-c975-4417-89f1-650002978bca" containerName="mariadb-account-delete" Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.945951 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-clkpq/must-gather-mfmmh" Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.950824 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-clkpq"/"default-dockercfg-xmfxn" Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.951126 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-clkpq"/"openshift-service-ca.crt" Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.951159 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-clkpq"/"kube-root-ca.crt" Mar 21 04:07:28 crc kubenswrapper[4685]: I0321 04:07:28.966159 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-clkpq/must-gather-mfmmh"] Mar 21 04:07:29 crc kubenswrapper[4685]: I0321 04:07:29.003686 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8h8m\" (UniqueName: \"kubernetes.io/projected/b741f897-9957-47db-b5ea-3687a5a12d53-kube-api-access-f8h8m\") pod \"must-gather-mfmmh\" (UID: \"b741f897-9957-47db-b5ea-3687a5a12d53\") " pod="openshift-must-gather-clkpq/must-gather-mfmmh" Mar 21 04:07:29 crc kubenswrapper[4685]: I0321 04:07:29.003763 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b741f897-9957-47db-b5ea-3687a5a12d53-must-gather-output\") pod \"must-gather-mfmmh\" (UID: \"b741f897-9957-47db-b5ea-3687a5a12d53\") " pod="openshift-must-gather-clkpq/must-gather-mfmmh" Mar 21 04:07:29 crc kubenswrapper[4685]: I0321 04:07:29.105386 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8h8m\" (UniqueName: \"kubernetes.io/projected/b741f897-9957-47db-b5ea-3687a5a12d53-kube-api-access-f8h8m\") pod \"must-gather-mfmmh\" (UID: \"b741f897-9957-47db-b5ea-3687a5a12d53\") " pod="openshift-must-gather-clkpq/must-gather-mfmmh" Mar 21 04:07:29 crc kubenswrapper[4685]: I0321 04:07:29.105656 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b741f897-9957-47db-b5ea-3687a5a12d53-must-gather-output\") pod \"must-gather-mfmmh\" (UID: \"b741f897-9957-47db-b5ea-3687a5a12d53\") " pod="openshift-must-gather-clkpq/must-gather-mfmmh" Mar 21 04:07:29 crc kubenswrapper[4685]: I0321 04:07:29.106076 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b741f897-9957-47db-b5ea-3687a5a12d53-must-gather-output\") pod \"must-gather-mfmmh\" (UID: \"b741f897-9957-47db-b5ea-3687a5a12d53\") " pod="openshift-must-gather-clkpq/must-gather-mfmmh" Mar 21 04:07:29 crc kubenswrapper[4685]: I0321 04:07:29.129112 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8h8m\" (UniqueName: \"kubernetes.io/projected/b741f897-9957-47db-b5ea-3687a5a12d53-kube-api-access-f8h8m\") pod \"must-gather-mfmmh\" (UID: \"b741f897-9957-47db-b5ea-3687a5a12d53\") " pod="openshift-must-gather-clkpq/must-gather-mfmmh" Mar 21 04:07:29 crc kubenswrapper[4685]: I0321 04:07:29.264355 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-clkpq/must-gather-mfmmh" Mar 21 04:07:29 crc kubenswrapper[4685]: I0321 04:07:29.700413 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-clkpq/must-gather-mfmmh"] Mar 21 04:07:29 crc kubenswrapper[4685]: I0321 04:07:29.719256 4685 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 04:07:30 crc kubenswrapper[4685]: I0321 04:07:30.167829 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-clkpq/must-gather-mfmmh" event={"ID":"b741f897-9957-47db-b5ea-3687a5a12d53","Type":"ContainerStarted","Data":"27a2228a1c8053c4097ed30a78477e4f1b6f560f2ba452a751c2430b5da4df04"} Mar 21 04:07:32 crc kubenswrapper[4685]: E0321 04:07:32.954029 4685 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e33f8cc_f6fa_48ab_a172_74892478c268.slice/crio-e2d92cc4f8795aed53ebcd983b89eb996fa6974a422c65d6956148de1c17f3c7.scope\": RecentStats: unable to find data in memory cache]" Mar 21 04:07:34 crc kubenswrapper[4685]: I0321 04:07:34.225061 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-clkpq/must-gather-mfmmh" event={"ID":"b741f897-9957-47db-b5ea-3687a5a12d53","Type":"ContainerStarted","Data":"44f969cb14e9d45dbd3d5b768ea5acaa48fcc4353c525de599ba6dbe5caa1e7f"} Mar 21 04:07:34 crc kubenswrapper[4685]: I0321 04:07:34.225424 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-clkpq/must-gather-mfmmh" event={"ID":"b741f897-9957-47db-b5ea-3687a5a12d53","Type":"ContainerStarted","Data":"57423b2d9cf7ff8da6ba809bb6110bea3a95eb85198af4d46eff3cfc85443b46"} Mar 21 04:07:34 crc kubenswrapper[4685]: I0321 04:07:34.246855 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-clkpq/must-gather-mfmmh" podStartSLOduration=2.932143031 podStartE2EDuration="6.246821725s" podCreationTimestamp="2026-03-21 04:07:28 +0000 UTC" firstStartedPulling="2026-03-21 04:07:29.718847598 +0000 UTC m=+1282.195916400" lastFinishedPulling="2026-03-21 04:07:33.033526302 +0000 UTC m=+1285.510595094" observedRunningTime="2026-03-21 04:07:34.242299498 +0000 UTC m=+1286.719368300" watchObservedRunningTime="2026-03-21 04:07:34.246821725 +0000 UTC m=+1286.723890517" Mar 21 04:07:38 crc kubenswrapper[4685]: I0321 04:07:38.490956 4685 scope.go:117] "RemoveContainer" containerID="a70e6b6534a9f9dd0ca29ac76fb7b519a688f958c1e1792d75fc98b6116e8d58" Mar 21 04:07:38 crc kubenswrapper[4685]: I0321 04:07:38.515423 4685 scope.go:117] "RemoveContainer" containerID="dba5e2417eade1eaaa37fce7e18ce3624852e3e52aa73be129bce881364f51ac" Mar 21 04:07:38 crc kubenswrapper[4685]: I0321 04:07:38.562621 4685 scope.go:117] "RemoveContainer" containerID="59c344cffa029dbb85f2621534d4872f749153ca1d3368b50a888fa846e9a355" Mar 21 04:07:43 crc kubenswrapper[4685]: E0321 04:07:43.094164 4685 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e33f8cc_f6fa_48ab_a172_74892478c268.slice/crio-e2d92cc4f8795aed53ebcd983b89eb996fa6974a422c65d6956148de1c17f3c7.scope\": RecentStats: unable to find data in memory cache]" Mar 21 04:07:53 crc kubenswrapper[4685]: E0321 04:07:53.214925 4685 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e33f8cc_f6fa_48ab_a172_74892478c268.slice/crio-e2d92cc4f8795aed53ebcd983b89eb996fa6974a422c65d6956148de1c17f3c7.scope\": RecentStats: unable to find data in memory cache]" Mar 21 04:08:00 crc kubenswrapper[4685]: I0321 04:08:00.149465 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567768-z5dsj"] Mar 21 04:08:00 crc kubenswrapper[4685]: I0321 04:08:00.151254 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567768-z5dsj" Mar 21 04:08:00 crc kubenswrapper[4685]: I0321 04:08:00.154911 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k75cc" Mar 21 04:08:00 crc kubenswrapper[4685]: I0321 04:08:00.155567 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 04:08:00 crc kubenswrapper[4685]: I0321 04:08:00.156880 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567768-z5dsj"] Mar 21 04:08:00 crc kubenswrapper[4685]: I0321 04:08:00.160403 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 04:08:00 crc kubenswrapper[4685]: I0321 04:08:00.179405 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6tjq\" (UniqueName: \"kubernetes.io/projected/eaa4eac7-8489-46a3-a666-7794ef9be68d-kube-api-access-s6tjq\") pod \"auto-csr-approver-29567768-z5dsj\" (UID: \"eaa4eac7-8489-46a3-a666-7794ef9be68d\") " pod="openshift-infra/auto-csr-approver-29567768-z5dsj" Mar 21 04:08:00 crc kubenswrapper[4685]: I0321 04:08:00.281314 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6tjq\" (UniqueName: \"kubernetes.io/projected/eaa4eac7-8489-46a3-a666-7794ef9be68d-kube-api-access-s6tjq\") pod \"auto-csr-approver-29567768-z5dsj\" (UID: \"eaa4eac7-8489-46a3-a666-7794ef9be68d\") " pod="openshift-infra/auto-csr-approver-29567768-z5dsj" Mar 21 04:08:00 crc kubenswrapper[4685]: I0321 04:08:00.308783 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6tjq\" (UniqueName: \"kubernetes.io/projected/eaa4eac7-8489-46a3-a666-7794ef9be68d-kube-api-access-s6tjq\") pod \"auto-csr-approver-29567768-z5dsj\" (UID: \"eaa4eac7-8489-46a3-a666-7794ef9be68d\") " pod="openshift-infra/auto-csr-approver-29567768-z5dsj" Mar 21 04:08:00 crc kubenswrapper[4685]: I0321 04:08:00.480539 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567768-z5dsj" Mar 21 04:08:00 crc kubenswrapper[4685]: I0321 04:08:00.867124 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567768-z5dsj"] Mar 21 04:08:01 crc kubenswrapper[4685]: I0321 04:08:01.412298 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567768-z5dsj" event={"ID":"eaa4eac7-8489-46a3-a666-7794ef9be68d","Type":"ContainerStarted","Data":"c29c355d890d190880186b1e490b14c74d21bcb26d84993c7cb0c0d19618f461"} Mar 21 04:08:02 crc kubenswrapper[4685]: I0321 04:08:02.419971 4685 generic.go:334] "Generic (PLEG): container finished" podID="eaa4eac7-8489-46a3-a666-7794ef9be68d" containerID="888486b10a5a1c291db786c174630f8cb84a6c68276f6917d0e8e43c51417d19" exitCode=0 Mar 21 04:08:02 crc kubenswrapper[4685]: I0321 04:08:02.420015 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567768-z5dsj" event={"ID":"eaa4eac7-8489-46a3-a666-7794ef9be68d","Type":"ContainerDied","Data":"888486b10a5a1c291db786c174630f8cb84a6c68276f6917d0e8e43c51417d19"} Mar 21 04:08:03 crc kubenswrapper[4685]: E0321 04:08:03.341374 4685 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e33f8cc_f6fa_48ab_a172_74892478c268.slice/crio-e2d92cc4f8795aed53ebcd983b89eb996fa6974a422c65d6956148de1c17f3c7.scope\": RecentStats: unable to find data in memory cache]" Mar 21 04:08:03 crc kubenswrapper[4685]: I0321 04:08:03.659272 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567768-z5dsj" Mar 21 04:08:03 crc kubenswrapper[4685]: I0321 04:08:03.719223 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6tjq\" (UniqueName: \"kubernetes.io/projected/eaa4eac7-8489-46a3-a666-7794ef9be68d-kube-api-access-s6tjq\") pod \"eaa4eac7-8489-46a3-a666-7794ef9be68d\" (UID: \"eaa4eac7-8489-46a3-a666-7794ef9be68d\") " Mar 21 04:08:03 crc kubenswrapper[4685]: I0321 04:08:03.725306 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaa4eac7-8489-46a3-a666-7794ef9be68d-kube-api-access-s6tjq" (OuterVolumeSpecName: "kube-api-access-s6tjq") pod "eaa4eac7-8489-46a3-a666-7794ef9be68d" (UID: "eaa4eac7-8489-46a3-a666-7794ef9be68d"). InnerVolumeSpecName "kube-api-access-s6tjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:08:03 crc kubenswrapper[4685]: I0321 04:08:03.820536 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6tjq\" (UniqueName: \"kubernetes.io/projected/eaa4eac7-8489-46a3-a666-7794ef9be68d-kube-api-access-s6tjq\") on node \"crc\" DevicePath \"\"" Mar 21 04:08:04 crc kubenswrapper[4685]: I0321 04:08:04.434019 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567768-z5dsj" event={"ID":"eaa4eac7-8489-46a3-a666-7794ef9be68d","Type":"ContainerDied","Data":"c29c355d890d190880186b1e490b14c74d21bcb26d84993c7cb0c0d19618f461"} Mar 21 04:08:04 crc kubenswrapper[4685]: I0321 04:08:04.434061 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c29c355d890d190880186b1e490b14c74d21bcb26d84993c7cb0c0d19618f461" Mar 21 04:08:04 crc kubenswrapper[4685]: I0321 04:08:04.434115 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567768-z5dsj" Mar 21 04:08:04 crc kubenswrapper[4685]: I0321 04:08:04.726518 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567762-7g776"] Mar 21 04:08:04 crc kubenswrapper[4685]: I0321 04:08:04.729895 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567762-7g776"] Mar 21 04:08:06 crc kubenswrapper[4685]: I0321 04:08:06.308736 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1cc3737-5c39-4975-af9a-f403efa1e0b7" path="/var/lib/kubelet/pods/e1cc3737-5c39-4975-af9a-f403efa1e0b7/volumes" Mar 21 04:08:20 crc kubenswrapper[4685]: I0321 04:08:20.353712 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-646sv_3f819915-64a0-4327-ac01-5ff842cbc592/control-plane-machine-set-operator/0.log" Mar 21 04:08:20 crc kubenswrapper[4685]: I0321 04:08:20.456985 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-flxks_1ee95c71-cb75-4357-aeff-c0417a0c6eb3/kube-rbac-proxy/0.log" Mar 21 04:08:20 crc kubenswrapper[4685]: I0321 04:08:20.504942 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-flxks_1ee95c71-cb75-4357-aeff-c0417a0c6eb3/machine-api-operator/0.log" Mar 21 04:08:38 crc kubenswrapper[4685]: I0321 04:08:38.794201 4685 scope.go:117] "RemoveContainer" containerID="e1a2e139b19c8746eee0fb0420f2322d3ae267c77164a3e29764c388194f4305" Mar 21 04:08:38 crc kubenswrapper[4685]: I0321 04:08:38.810505 4685 scope.go:117] "RemoveContainer" containerID="976b2eff86b07bc03d8c0610b631694b641b44bb86692d56bf5953f1c5ee2337" Mar 21 04:08:38 crc kubenswrapper[4685]: I0321 04:08:38.844264 4685 scope.go:117] "RemoveContainer" containerID="7acbb1263af799a95715386bddb8f2b759c9c96f371dd50aa9df8942e78be9b4" Mar 21 04:08:38 crc kubenswrapper[4685]: I0321 04:08:38.859748 4685 scope.go:117] "RemoveContainer" containerID="292f6eff2ae40856013b4696f5aebcf7f360d67a6d20fb023657731f90209e5b" Mar 21 04:08:39 crc kubenswrapper[4685]: I0321 04:08:39.685180 4685 patch_prober.go:28] interesting pod/machine-config-daemon-7r9cg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:08:39 crc kubenswrapper[4685]: I0321 04:08:39.685252 4685 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:08:47 crc kubenswrapper[4685]: I0321 04:08:47.753798 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-w4n8h_76cca999-b151-46c4-b61b-b6249d75e2f5/kube-rbac-proxy/0.log" Mar 21 04:08:47 crc kubenswrapper[4685]: I0321 04:08:47.776431 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-w4n8h_76cca999-b151-46c4-b61b-b6249d75e2f5/controller/0.log" Mar 21 04:08:47 crc kubenswrapper[4685]: I0321 04:08:47.920324 4685 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/cp-frr-files/0.log" Mar 21 04:08:48 crc kubenswrapper[4685]: I0321 04:08:48.090617 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/cp-frr-files/0.log" Mar 21 04:08:48 crc kubenswrapper[4685]: I0321 04:08:48.096639 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/cp-reloader/0.log" Mar 21 04:08:48 crc kubenswrapper[4685]: I0321 04:08:48.100343 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/cp-metrics/0.log" Mar 21 04:08:48 crc kubenswrapper[4685]: I0321 04:08:48.142560 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/cp-reloader/0.log" Mar 21 04:08:48 crc kubenswrapper[4685]: I0321 04:08:48.268588 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/cp-metrics/0.log" Mar 21 04:08:48 crc kubenswrapper[4685]: I0321 04:08:48.281436 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/cp-frr-files/0.log" Mar 21 04:08:48 crc kubenswrapper[4685]: I0321 04:08:48.298056 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/cp-metrics/0.log" Mar 21 04:08:48 crc kubenswrapper[4685]: I0321 04:08:48.324904 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/cp-reloader/0.log" Mar 21 04:08:48 crc kubenswrapper[4685]: I0321 04:08:48.470430 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/cp-reloader/0.log" Mar 21 04:08:48 crc kubenswrapper[4685]: I0321 04:08:48.476616 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/cp-frr-files/0.log" Mar 21 04:08:48 crc kubenswrapper[4685]: I0321 04:08:48.479470 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/cp-metrics/0.log" Mar 21 04:08:48 crc kubenswrapper[4685]: I0321 04:08:48.497477 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/controller/0.log" Mar 21 04:08:48 crc kubenswrapper[4685]: I0321 04:08:48.630513 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/kube-rbac-proxy/0.log" Mar 21 04:08:48 crc kubenswrapper[4685]: I0321 04:08:48.673862 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/kube-rbac-proxy-frr/0.log" Mar 21 04:08:48 crc kubenswrapper[4685]: I0321 04:08:48.702120 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/frr-metrics/0.log" Mar 21 04:08:48 crc kubenswrapper[4685]: I0321 04:08:48.840707 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/reloader/0.log" Mar 21 04:08:48 crc kubenswrapper[4685]: I0321 
04:08:48.918592 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-rjxbv_545a6f92-59ae-4ffb-824d-e493044c0082/frr-k8s-webhook-server/0.log" Mar 21 04:08:49 crc kubenswrapper[4685]: I0321 04:08:49.017580 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-9b7d5d78b-jx8nv_ed0eadeb-865c-4742-b429-5f8e0bd67f2b/manager/0.log" Mar 21 04:08:49 crc kubenswrapper[4685]: I0321 04:08:49.091925 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/frr/0.log" Mar 21 04:08:49 crc kubenswrapper[4685]: I0321 04:08:49.195188 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-654575f8df-qj9tz_f2976724-903c-43f4-b917-da8a483a2e9e/webhook-server/0.log" Mar 21 04:08:49 crc kubenswrapper[4685]: I0321 04:08:49.254029 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-cd4d4_dc9a0be8-5b8c-43d8-a670-06541535d7a0/kube-rbac-proxy/0.log" Mar 21 04:08:49 crc kubenswrapper[4685]: I0321 04:08:49.400152 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-cd4d4_dc9a0be8-5b8c-43d8-a670-06541535d7a0/speaker/0.log" Mar 21 04:09:09 crc kubenswrapper[4685]: I0321 04:09:09.685949 4685 patch_prober.go:28] interesting pod/machine-config-daemon-7r9cg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:09:09 crc kubenswrapper[4685]: I0321 04:09:09.686568 4685 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:09:12 crc kubenswrapper[4685]: I0321 04:09:12.169606 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs_8cbeffba-f99c-488d-b0db-1cf3b8e31823/util/0.log" Mar 21 04:09:12 crc kubenswrapper[4685]: I0321 04:09:12.821586 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs_8cbeffba-f99c-488d-b0db-1cf3b8e31823/pull/0.log" Mar 21 04:09:12 crc kubenswrapper[4685]: I0321 04:09:12.831300 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs_8cbeffba-f99c-488d-b0db-1cf3b8e31823/pull/0.log" Mar 21 04:09:12 crc kubenswrapper[4685]: I0321 04:09:12.970040 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs_8cbeffba-f99c-488d-b0db-1cf3b8e31823/util/0.log" Mar 21 04:09:13 crc kubenswrapper[4685]: I0321 04:09:13.089984 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs_8cbeffba-f99c-488d-b0db-1cf3b8e31823/util/0.log" Mar 21 04:09:13 crc kubenswrapper[4685]: I0321 04:09:13.097683 4685 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs_8cbeffba-f99c-488d-b0db-1cf3b8e31823/extract/0.log" Mar 21 04:09:13 crc kubenswrapper[4685]: I0321 04:09:13.109381 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs_8cbeffba-f99c-488d-b0db-1cf3b8e31823/pull/0.log" Mar 21 04:09:13 crc kubenswrapper[4685]: I0321 04:09:13.250107 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-h85m6_43541153-b685-4759-bedf-261b2936431d/extract-utilities/0.log" Mar 21 04:09:13 crc kubenswrapper[4685]: I0321 04:09:13.451688 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-h85m6_43541153-b685-4759-bedf-261b2936431d/extract-content/0.log" Mar 21 04:09:13 crc kubenswrapper[4685]: I0321 04:09:13.456159 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-h85m6_43541153-b685-4759-bedf-261b2936431d/extract-utilities/0.log" Mar 21 04:09:13 crc kubenswrapper[4685]: I0321 04:09:13.465339 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-h85m6_43541153-b685-4759-bedf-261b2936431d/extract-content/0.log" Mar 21 04:09:13 crc kubenswrapper[4685]: I0321 04:09:13.611800 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-h85m6_43541153-b685-4759-bedf-261b2936431d/extract-utilities/0.log" Mar 21 04:09:13 crc kubenswrapper[4685]: I0321 04:09:13.612085 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-h85m6_43541153-b685-4759-bedf-261b2936431d/extract-content/0.log" Mar 21 04:09:13 crc kubenswrapper[4685]: I0321 04:09:13.774340 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gn69w_8038baba-1420-4797-9752-5490c0940929/extract-utilities/0.log" Mar 21 04:09:13 crc kubenswrapper[4685]: I0321 04:09:13.942074 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-h85m6_43541153-b685-4759-bedf-261b2936431d/registry-server/0.log" Mar 21 04:09:13 crc kubenswrapper[4685]: I0321 04:09:13.957977 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gn69w_8038baba-1420-4797-9752-5490c0940929/extract-utilities/0.log" Mar 21 04:09:13 crc kubenswrapper[4685]: I0321 04:09:13.977694 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gn69w_8038baba-1420-4797-9752-5490c0940929/extract-content/0.log" Mar 21 04:09:13 crc kubenswrapper[4685]: I0321 04:09:13.986730 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gn69w_8038baba-1420-4797-9752-5490c0940929/extract-content/0.log" Mar 21 04:09:14 crc kubenswrapper[4685]: I0321 04:09:14.147202 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gn69w_8038baba-1420-4797-9752-5490c0940929/extract-utilities/0.log" Mar 21 04:09:14 crc kubenswrapper[4685]: I0321 04:09:14.175238 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gn69w_8038baba-1420-4797-9752-5490c0940929/extract-content/0.log" Mar 21 04:09:14 crc kubenswrapper[4685]: I0321 04:09:14.384873 4685 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-h9g9c_1efd0452-eb45-4336-a0eb-2e171d3da229/marketplace-operator/0.log" Mar 21 04:09:14 crc kubenswrapper[4685]: I0321 04:09:14.451552 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gn69w_8038baba-1420-4797-9752-5490c0940929/registry-server/0.log" Mar 21 04:09:14 crc kubenswrapper[4685]: I0321 04:09:14.469604 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bcc52_c0edc692-b945-418e-8d2e-129f9c88644e/extract-utilities/0.log" Mar 21 04:09:14 crc kubenswrapper[4685]: I0321 04:09:14.569005 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bcc52_c0edc692-b945-418e-8d2e-129f9c88644e/extract-utilities/0.log" Mar 21 04:09:14 crc kubenswrapper[4685]: I0321 04:09:14.572984 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bcc52_c0edc692-b945-418e-8d2e-129f9c88644e/extract-content/0.log" Mar 21 04:09:14 crc kubenswrapper[4685]: I0321 04:09:14.629664 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bcc52_c0edc692-b945-418e-8d2e-129f9c88644e/extract-content/0.log" Mar 21 04:09:14 crc kubenswrapper[4685]: I0321 04:09:14.793770 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bcc52_c0edc692-b945-418e-8d2e-129f9c88644e/extract-utilities/0.log" Mar 21 04:09:14 crc kubenswrapper[4685]: I0321 04:09:14.819427 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bcc52_c0edc692-b945-418e-8d2e-129f9c88644e/extract-content/0.log" Mar 21 04:09:14 crc kubenswrapper[4685]: I0321 04:09:14.941617 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bcc52_c0edc692-b945-418e-8d2e-129f9c88644e/registry-server/0.log" Mar 21 04:09:14 crc kubenswrapper[4685]: I0321 04:09:14.988204 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ttxzc_5118e92f-b64a-4a3b-b9e7-3902c745dbdd/extract-utilities/0.log" Mar 21 04:09:15 crc kubenswrapper[4685]: I0321 04:09:15.104472 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ttxzc_5118e92f-b64a-4a3b-b9e7-3902c745dbdd/extract-content/0.log" Mar 21 04:09:15 crc kubenswrapper[4685]: I0321 04:09:15.105500 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ttxzc_5118e92f-b64a-4a3b-b9e7-3902c745dbdd/extract-utilities/0.log" Mar 21 04:09:15 crc kubenswrapper[4685]: I0321 04:09:15.122677 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ttxzc_5118e92f-b64a-4a3b-b9e7-3902c745dbdd/extract-content/0.log" Mar 21 04:09:15 crc kubenswrapper[4685]: I0321 04:09:15.274120 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ttxzc_5118e92f-b64a-4a3b-b9e7-3902c745dbdd/extract-utilities/0.log" Mar 21 04:09:15 crc kubenswrapper[4685]: I0321 04:09:15.314070 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ttxzc_5118e92f-b64a-4a3b-b9e7-3902c745dbdd/extract-content/0.log" Mar 21 04:09:15 crc kubenswrapper[4685]: I0321 04:09:15.601921 4685 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ttxzc_5118e92f-b64a-4a3b-b9e7-3902c745dbdd/registry-server/0.log" Mar 21 04:09:38 crc kubenswrapper[4685]: I0321 04:09:38.930973 4685 scope.go:117] "RemoveContainer" containerID="e0711fad77f15547cf76fbed59b0298329ab898dd1c5ba6eaa30e3a4802b5499" Mar 21 04:09:38 crc kubenswrapper[4685]: I0321 04:09:38.980621 4685 scope.go:117] "RemoveContainer" containerID="58e207a688cfbd25650183c8c86f302428d2c16db293af38dfde5313bac735c2" Mar 21 04:09:39 crc kubenswrapper[4685]: I0321 04:09:39.012930 4685 scope.go:117] "RemoveContainer" containerID="bd057d98b9b981c88d0cd15d750d56676000ca2649004b19bf75f9409685033f" Mar 21 04:09:39 crc kubenswrapper[4685]: I0321 04:09:39.038016 4685 scope.go:117] "RemoveContainer" containerID="e8000909b4890418d0431be2bda32cbf14a20b8de4e545501e17052be394ac67" Mar 21 04:09:39 crc kubenswrapper[4685]: I0321 04:09:39.058445 4685 scope.go:117] "RemoveContainer" containerID="1a79bccbf11405af6049a2b18f3f34e8ab927b7532b1d8a690ab6978034e65de" Mar 21 04:09:39 crc kubenswrapper[4685]: I0321 04:09:39.075437 4685 scope.go:117] "RemoveContainer" containerID="62f4399fd4930f3ab0ae55d2fcc04d98910e30189611688a706f8f63a35fbae3" Mar 21 04:09:39 crc kubenswrapper[4685]: I0321 04:09:39.097053 4685 scope.go:117] "RemoveContainer" containerID="afe8072d40f41f86cd8e9ca3876885870fa9135be27eb3bf9c6892869419732a" Mar 21 04:09:39 crc kubenswrapper[4685]: I0321 04:09:39.685990 4685 patch_prober.go:28] interesting pod/machine-config-daemon-7r9cg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:09:39 crc kubenswrapper[4685]: I0321 04:09:39.686099 4685 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:09:39 crc kubenswrapper[4685]: I0321 04:09:39.686162 4685 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" Mar 21 04:09:39 crc kubenswrapper[4685]: I0321 04:09:39.686985 4685 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"855883c827cd8a38d55f90b9086e8832f325f74b077f5a71a8a2d2ad0a467f7f"} pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 04:09:39 crc kubenswrapper[4685]: I0321 04:09:39.687088 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerName="machine-config-daemon" containerID="cri-o://855883c827cd8a38d55f90b9086e8832f325f74b077f5a71a8a2d2ad0a467f7f" gracePeriod=600 Mar 21 04:09:40 crc kubenswrapper[4685]: I0321 04:09:40.408757 4685 generic.go:334] "Generic (PLEG): container finished" podID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerID="855883c827cd8a38d55f90b9086e8832f325f74b077f5a71a8a2d2ad0a467f7f" exitCode=0 Mar 21 04:09:40 crc kubenswrapper[4685]: I0321 04:09:40.408849 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" event={"ID":"cea46fe2-4e41-43ab-a069-cb30fb4e732c","Type":"ContainerDied","Data":"855883c827cd8a38d55f90b9086e8832f325f74b077f5a71a8a2d2ad0a467f7f"} Mar 21 04:09:40 crc kubenswrapper[4685]: I0321 04:09:40.409298 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" event={"ID":"cea46fe2-4e41-43ab-a069-cb30fb4e732c","Type":"ContainerStarted","Data":"d777c9ea0ed74e12c330732b76bb70566fa8eb4260f58ef17bc4882bafb2cbc6"} Mar 21 04:09:40 crc kubenswrapper[4685]: I0321 04:09:40.409320 4685 scope.go:117] "RemoveContainer" containerID="fc184a6e763e19dcb85e7464f3adc7dbb9e9291d839d9e13c38f6aed20771d12" Mar 21 04:10:00 crc kubenswrapper[4685]: I0321 04:10:00.148729 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567770-9677p"] Mar 21 04:10:00 crc kubenswrapper[4685]: E0321 04:10:00.150981 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaa4eac7-8489-46a3-a666-7794ef9be68d" containerName="oc" Mar 21 04:10:00 crc kubenswrapper[4685]: I0321 04:10:00.151189 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaa4eac7-8489-46a3-a666-7794ef9be68d" containerName="oc" Mar 21 04:10:00 crc kubenswrapper[4685]: I0321 04:10:00.151512 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaa4eac7-8489-46a3-a666-7794ef9be68d" containerName="oc" Mar 21 04:10:00 crc kubenswrapper[4685]: I0321 04:10:00.152105 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567770-9677p" Mar 21 04:10:00 crc kubenswrapper[4685]: I0321 04:10:00.154860 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k75cc" Mar 21 04:10:00 crc kubenswrapper[4685]: I0321 04:10:00.155343 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567770-9677p"] Mar 21 04:10:00 crc kubenswrapper[4685]: I0321 04:10:00.158047 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 04:10:00 crc kubenswrapper[4685]: I0321 04:10:00.158194 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 04:10:00 crc kubenswrapper[4685]: I0321 04:10:00.257374 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-222b9\" (UniqueName: \"kubernetes.io/projected/75432731-b3b0-48bb-a0b9-6985398ddaf5-kube-api-access-222b9\") pod \"auto-csr-approver-29567770-9677p\" (UID: \"75432731-b3b0-48bb-a0b9-6985398ddaf5\") " pod="openshift-infra/auto-csr-approver-29567770-9677p" Mar 21 04:10:00 crc kubenswrapper[4685]: I0321 04:10:00.359216 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-222b9\" (UniqueName: \"kubernetes.io/projected/75432731-b3b0-48bb-a0b9-6985398ddaf5-kube-api-access-222b9\") pod \"auto-csr-approver-29567770-9677p\" (UID: \"75432731-b3b0-48bb-a0b9-6985398ddaf5\") " pod="openshift-infra/auto-csr-approver-29567770-9677p" Mar 21 04:10:00 crc kubenswrapper[4685]: I0321 04:10:00.392027 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-222b9\" (UniqueName: \"kubernetes.io/projected/75432731-b3b0-48bb-a0b9-6985398ddaf5-kube-api-access-222b9\") pod \"auto-csr-approver-29567770-9677p\" (UID: 
\"75432731-b3b0-48bb-a0b9-6985398ddaf5\") " pod="openshift-infra/auto-csr-approver-29567770-9677p" Mar 21 04:10:00 crc kubenswrapper[4685]: I0321 04:10:00.482208 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567770-9677p" Mar 21 04:10:00 crc kubenswrapper[4685]: I0321 04:10:00.907706 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567770-9677p"] Mar 21 04:10:01 crc kubenswrapper[4685]: I0321 04:10:01.575366 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567770-9677p" event={"ID":"75432731-b3b0-48bb-a0b9-6985398ddaf5","Type":"ContainerStarted","Data":"5d5db3ad21d2068770bafe7aa2fdf5056758c2024a6cba3753408c5675b956af"} Mar 21 04:10:02 crc kubenswrapper[4685]: I0321 04:10:02.582179 4685 generic.go:334] "Generic (PLEG): container finished" podID="75432731-b3b0-48bb-a0b9-6985398ddaf5" containerID="3509c24ae14ab73179aa9701880d9ee9ca702785c2d16217cd4102275b1d6d17" exitCode=0 Mar 21 04:10:02 crc kubenswrapper[4685]: I0321 04:10:02.582427 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567770-9677p" event={"ID":"75432731-b3b0-48bb-a0b9-6985398ddaf5","Type":"ContainerDied","Data":"3509c24ae14ab73179aa9701880d9ee9ca702785c2d16217cd4102275b1d6d17"} Mar 21 04:10:03 crc kubenswrapper[4685]: I0321 04:10:03.857169 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567770-9677p" Mar 21 04:10:03 crc kubenswrapper[4685]: I0321 04:10:03.911415 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-222b9\" (UniqueName: \"kubernetes.io/projected/75432731-b3b0-48bb-a0b9-6985398ddaf5-kube-api-access-222b9\") pod \"75432731-b3b0-48bb-a0b9-6985398ddaf5\" (UID: \"75432731-b3b0-48bb-a0b9-6985398ddaf5\") " Mar 21 04:10:03 crc kubenswrapper[4685]: I0321 04:10:03.916584 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75432731-b3b0-48bb-a0b9-6985398ddaf5-kube-api-access-222b9" (OuterVolumeSpecName: "kube-api-access-222b9") pod "75432731-b3b0-48bb-a0b9-6985398ddaf5" (UID: "75432731-b3b0-48bb-a0b9-6985398ddaf5"). InnerVolumeSpecName "kube-api-access-222b9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:10:04 crc kubenswrapper[4685]: I0321 04:10:04.013310 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-222b9\" (UniqueName: \"kubernetes.io/projected/75432731-b3b0-48bb-a0b9-6985398ddaf5-kube-api-access-222b9\") on node \"crc\" DevicePath \"\"" Mar 21 04:10:04 crc kubenswrapper[4685]: I0321 04:10:04.605411 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567770-9677p" event={"ID":"75432731-b3b0-48bb-a0b9-6985398ddaf5","Type":"ContainerDied","Data":"5d5db3ad21d2068770bafe7aa2fdf5056758c2024a6cba3753408c5675b956af"} Mar 21 04:10:04 crc kubenswrapper[4685]: I0321 04:10:04.605741 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d5db3ad21d2068770bafe7aa2fdf5056758c2024a6cba3753408c5675b956af" Mar 21 04:10:04 crc kubenswrapper[4685]: I0321 04:10:04.605817 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567770-9677p" Mar 21 04:10:04 crc kubenswrapper[4685]: I0321 04:10:04.927035 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567764-pqzcc"] Mar 21 04:10:04 crc kubenswrapper[4685]: I0321 04:10:04.936340 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567764-pqzcc"] Mar 21 04:10:06 crc kubenswrapper[4685]: I0321 04:10:06.309999 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f957d4c-d9e0-4c92-ac81-bd5bdab751ad" path="/var/lib/kubelet/pods/0f957d4c-d9e0-4c92-ac81-bd5bdab751ad/volumes" Mar 21 04:10:27 crc kubenswrapper[4685]: I0321 04:10:27.764387 4685 generic.go:334] "Generic (PLEG): container finished" podID="b741f897-9957-47db-b5ea-3687a5a12d53" containerID="57423b2d9cf7ff8da6ba809bb6110bea3a95eb85198af4d46eff3cfc85443b46" exitCode=0 Mar 21 04:10:27 crc kubenswrapper[4685]: I0321 04:10:27.764519 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-clkpq/must-gather-mfmmh" event={"ID":"b741f897-9957-47db-b5ea-3687a5a12d53","Type":"ContainerDied","Data":"57423b2d9cf7ff8da6ba809bb6110bea3a95eb85198af4d46eff3cfc85443b46"} Mar 21 04:10:27 crc kubenswrapper[4685]: I0321 04:10:27.765430 4685 scope.go:117] "RemoveContainer" containerID="57423b2d9cf7ff8da6ba809bb6110bea3a95eb85198af4d46eff3cfc85443b46" Mar 21 04:10:27 crc kubenswrapper[4685]: I0321 04:10:27.847919 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-clkpq_must-gather-mfmmh_b741f897-9957-47db-b5ea-3687a5a12d53/gather/0.log" Mar 21 04:10:36 crc kubenswrapper[4685]: I0321 04:10:36.237499 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-clkpq/must-gather-mfmmh"] Mar 21 04:10:36 crc kubenswrapper[4685]: I0321 04:10:36.238405 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-clkpq/must-gather-mfmmh" podUID="b741f897-9957-47db-b5ea-3687a5a12d53" containerName="copy" containerID="cri-o://44f969cb14e9d45dbd3d5b768ea5acaa48fcc4353c525de599ba6dbe5caa1e7f" gracePeriod=2 Mar 21 04:10:36 crc kubenswrapper[4685]: I0321 04:10:36.244959 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-clkpq/must-gather-mfmmh"] Mar 21 04:10:36 crc kubenswrapper[4685]: I0321 04:10:36.650206 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-clkpq_must-gather-mfmmh_b741f897-9957-47db-b5ea-3687a5a12d53/copy/0.log" Mar 21 04:10:36 crc kubenswrapper[4685]: I0321 04:10:36.650884 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-clkpq/must-gather-mfmmh" Mar 21 04:10:36 crc kubenswrapper[4685]: I0321 04:10:36.725226 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8h8m\" (UniqueName: \"kubernetes.io/projected/b741f897-9957-47db-b5ea-3687a5a12d53-kube-api-access-f8h8m\") pod \"b741f897-9957-47db-b5ea-3687a5a12d53\" (UID: \"b741f897-9957-47db-b5ea-3687a5a12d53\") " Mar 21 04:10:36 crc kubenswrapper[4685]: I0321 04:10:36.725296 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b741f897-9957-47db-b5ea-3687a5a12d53-must-gather-output\") pod \"b741f897-9957-47db-b5ea-3687a5a12d53\" (UID: \"b741f897-9957-47db-b5ea-3687a5a12d53\") " Mar 21 04:10:36 crc kubenswrapper[4685]: I0321 04:10:36.730746 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b741f897-9957-47db-b5ea-3687a5a12d53-kube-api-access-f8h8m" (OuterVolumeSpecName: "kube-api-access-f8h8m") pod "b741f897-9957-47db-b5ea-3687a5a12d53" (UID: "b741f897-9957-47db-b5ea-3687a5a12d53"). InnerVolumeSpecName "kube-api-access-f8h8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:10:36 crc kubenswrapper[4685]: I0321 04:10:36.782942 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b741f897-9957-47db-b5ea-3687a5a12d53-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "b741f897-9957-47db-b5ea-3687a5a12d53" (UID: "b741f897-9957-47db-b5ea-3687a5a12d53"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:10:36 crc kubenswrapper[4685]: I0321 04:10:36.826394 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8h8m\" (UniqueName: \"kubernetes.io/projected/b741f897-9957-47db-b5ea-3687a5a12d53-kube-api-access-f8h8m\") on node \"crc\" DevicePath \"\"" Mar 21 04:10:36 crc kubenswrapper[4685]: I0321 04:10:36.826428 4685 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b741f897-9957-47db-b5ea-3687a5a12d53-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 21 04:10:37 crc kubenswrapper[4685]: I0321 04:10:37.212247 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-clkpq_must-gather-mfmmh_b741f897-9957-47db-b5ea-3687a5a12d53/copy/0.log" Mar 21 04:10:37 crc kubenswrapper[4685]: I0321 04:10:37.212649 4685 generic.go:334] "Generic (PLEG): container finished" podID="b741f897-9957-47db-b5ea-3687a5a12d53" containerID="44f969cb14e9d45dbd3d5b768ea5acaa48fcc4353c525de599ba6dbe5caa1e7f" exitCode=143 Mar 21 04:10:37 crc kubenswrapper[4685]: I0321 04:10:37.212719 4685 scope.go:117] "RemoveContainer" containerID="44f969cb14e9d45dbd3d5b768ea5acaa48fcc4353c525de599ba6dbe5caa1e7f" Mar 21 04:10:37 crc kubenswrapper[4685]: I0321 04:10:37.212745 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-clkpq/must-gather-mfmmh" Mar 21 04:10:37 crc kubenswrapper[4685]: I0321 04:10:37.230790 4685 scope.go:117] "RemoveContainer" containerID="57423b2d9cf7ff8da6ba809bb6110bea3a95eb85198af4d46eff3cfc85443b46" Mar 21 04:10:37 crc kubenswrapper[4685]: I0321 04:10:37.275340 4685 scope.go:117] "RemoveContainer" containerID="44f969cb14e9d45dbd3d5b768ea5acaa48fcc4353c525de599ba6dbe5caa1e7f" Mar 21 04:10:37 crc kubenswrapper[4685]: E0321 04:10:37.276559 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44f969cb14e9d45dbd3d5b768ea5acaa48fcc4353c525de599ba6dbe5caa1e7f\": container with ID starting with 44f969cb14e9d45dbd3d5b768ea5acaa48fcc4353c525de599ba6dbe5caa1e7f not found: ID does not exist" containerID="44f969cb14e9d45dbd3d5b768ea5acaa48fcc4353c525de599ba6dbe5caa1e7f" Mar 21 04:10:37 crc kubenswrapper[4685]: I0321 04:10:37.276651 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44f969cb14e9d45dbd3d5b768ea5acaa48fcc4353c525de599ba6dbe5caa1e7f"} err="failed to get container status \"44f969cb14e9d45dbd3d5b768ea5acaa48fcc4353c525de599ba6dbe5caa1e7f\": rpc error: code = NotFound desc = could not find container \"44f969cb14e9d45dbd3d5b768ea5acaa48fcc4353c525de599ba6dbe5caa1e7f\": container with ID starting with 44f969cb14e9d45dbd3d5b768ea5acaa48fcc4353c525de599ba6dbe5caa1e7f not found: ID does not exist" Mar 21 04:10:37 crc kubenswrapper[4685]: I0321 04:10:37.276735 4685 scope.go:117] "RemoveContainer" containerID="57423b2d9cf7ff8da6ba809bb6110bea3a95eb85198af4d46eff3cfc85443b46" Mar 21 04:10:37 crc kubenswrapper[4685]: E0321 04:10:37.277498 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57423b2d9cf7ff8da6ba809bb6110bea3a95eb85198af4d46eff3cfc85443b46\": container with ID starting with 57423b2d9cf7ff8da6ba809bb6110bea3a95eb85198af4d46eff3cfc85443b46 not found: ID does not exist" containerID="57423b2d9cf7ff8da6ba809bb6110bea3a95eb85198af4d46eff3cfc85443b46" Mar 21 04:10:37 crc kubenswrapper[4685]: I0321 04:10:37.277534 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57423b2d9cf7ff8da6ba809bb6110bea3a95eb85198af4d46eff3cfc85443b46"} err="failed to get container status \"57423b2d9cf7ff8da6ba809bb6110bea3a95eb85198af4d46eff3cfc85443b46\": rpc error: code = NotFound desc = could not find container \"57423b2d9cf7ff8da6ba809bb6110bea3a95eb85198af4d46eff3cfc85443b46\": container with ID starting with 57423b2d9cf7ff8da6ba809bb6110bea3a95eb85198af4d46eff3cfc85443b46 not found: ID does not exist" Mar 21 04:10:38 crc kubenswrapper[4685]: I0321 04:10:38.311514 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b741f897-9957-47db-b5ea-3687a5a12d53" path="/var/lib/kubelet/pods/b741f897-9957-47db-b5ea-3687a5a12d53/volumes" Mar 21 04:10:39 crc kubenswrapper[4685]: I0321 04:10:39.148012 4685 scope.go:117] "RemoveContainer" containerID="ece686729f1617c9e2677d81696e23494e78fee60fdf82dc773918bae0d3fbf9" Mar 21 04:10:39 crc kubenswrapper[4685]: I0321 04:10:39.180034 4685 scope.go:117] "RemoveContainer" containerID="089f0026a8c686205a2ed8e0b9efec14c6e581acad02ab00af33a41dc56d895b" Mar 21 04:10:39 crc kubenswrapper[4685]: I0321 04:10:39.196149 4685 scope.go:117] "RemoveContainer" containerID="5c9ad915bb8485684b25dcbb01306ea349ffb95abbd38192e8ee720fef63b8c3" Mar 21 04:10:39 crc 
Mar 21 04:10:38 crc kubenswrapper[4685]: I0321 04:10:38.311514 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b741f897-9957-47db-b5ea-3687a5a12d53" path="/var/lib/kubelet/pods/b741f897-9957-47db-b5ea-3687a5a12d53/volumes" Mar 21 04:10:39 crc kubenswrapper[4685]: I0321 04:10:39.148012 4685 scope.go:117] "RemoveContainer" containerID="ece686729f1617c9e2677d81696e23494e78fee60fdf82dc773918bae0d3fbf9" Mar 21 04:10:39 crc kubenswrapper[4685]: I0321 04:10:39.180034 4685 scope.go:117] "RemoveContainer" containerID="089f0026a8c686205a2ed8e0b9efec14c6e581acad02ab00af33a41dc56d895b" Mar 21 04:10:39 crc kubenswrapper[4685]: I0321 04:10:39.196149 4685 scope.go:117] "RemoveContainer" containerID="5c9ad915bb8485684b25dcbb01306ea349ffb95abbd38192e8ee720fef63b8c3" Mar 21 04:10:39 crc kubenswrapper[4685]: I0321 04:10:39.219535 4685 scope.go:117] "RemoveContainer" containerID="5f9b13843ce7cfe60bbca680a9a3b2a760a9118156fc8bc8be58f8433440e0f6" Mar 21 04:10:39 crc kubenswrapper[4685]: I0321 04:10:39.256064 4685 scope.go:117] "RemoveContainer" containerID="048b3767f6dddcdbfd89a1ee56c73e3f2bd43d828a6c717095fa7ae4777fd5b4" Mar 21 04:10:39 crc kubenswrapper[4685]: I0321 04:10:39.294635 4685 scope.go:117] "RemoveContainer" containerID="cf8e3abcb5fdb58d7e4723134a52f88708af10f1a7caa07e15a0ed125f689048" Mar 21 04:10:39 crc kubenswrapper[4685]: I0321 04:10:39.326328 4685 scope.go:117] "RemoveContainer" containerID="a5e78ba39e157c3f256ba210957b89e74ba910d6b58f14241d5fa9e214f88bd9" Mar 21 04:11:39 crc kubenswrapper[4685]: I0321 04:11:39.422402 4685 scope.go:117] "RemoveContainer" containerID="d60f5c559582dc45d7a37558330eb8efd4ecbb1054042c3729d4b402b10a4150" Mar 21 04:11:39 crc kubenswrapper[4685]: I0321 04:11:39.445068 4685 scope.go:117] "RemoveContainer" containerID="504e1e37d72164f3df29405c996af83bddc68957778611fcb847936ee9327586" Mar 21 04:11:39 crc kubenswrapper[4685]: I0321 04:11:39.471792 4685 scope.go:117] "RemoveContainer" containerID="488bfae4e304e9216c77c046db670be2391ea0f1783a9db9181b2197faca2483" Mar 21 04:11:39 crc kubenswrapper[4685]: I0321 04:11:39.513391 4685 scope.go:117] "RemoveContainer" containerID="d635250f0ee1eea21d79b1e0ca7e4f6e4057440a9515737b2f0614c19a9ce1fc" Mar 21 04:11:39 crc kubenswrapper[4685]: I0321 04:11:39.535361 4685 scope.go:117] "RemoveContainer" containerID="cbbd1a2ce1003d134cae1726505335a0eaf118ab38ccb1008f2e26952fb34710" Mar 21 04:12:00 crc kubenswrapper[4685]: I0321 04:12:00.158419 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567772-jtj9v"] Mar 21 04:12:00 crc kubenswrapper[4685]: E0321 04:12:00.159526 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75432731-b3b0-48bb-a0b9-6985398ddaf5" containerName="oc" Mar 21 04:12:00 crc kubenswrapper[4685]: I0321 04:12:00.159554 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="75432731-b3b0-48bb-a0b9-6985398ddaf5" containerName="oc" Mar 21 04:12:00 crc kubenswrapper[4685]: E0321 04:12:00.159589 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b741f897-9957-47db-b5ea-3687a5a12d53" containerName="copy" Mar 21 04:12:00 crc kubenswrapper[4685]: I0321 04:12:00.159605 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="b741f897-9957-47db-b5ea-3687a5a12d53" containerName="copy" Mar 21 04:12:00 crc kubenswrapper[4685]: E0321 04:12:00.159634 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b741f897-9957-47db-b5ea-3687a5a12d53" containerName="gather" Mar 21 04:12:00 crc kubenswrapper[4685]: I0321 04:12:00.159649 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="b741f897-9957-47db-b5ea-3687a5a12d53" containerName="gather" Mar 21 04:12:00 crc kubenswrapper[4685]: I0321 04:12:00.159928 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="b741f897-9957-47db-b5ea-3687a5a12d53" containerName="copy" Mar 21 04:12:00 crc kubenswrapper[4685]: I0321 04:12:00.159957 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="75432731-b3b0-48bb-a0b9-6985398ddaf5" containerName="oc" Mar 21 04:12:00 crc kubenswrapper[4685]: I0321 04:12:00.159978 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="b741f897-9957-47db-b5ea-3687a5a12d53" containerName="gather" Mar 21 04:12:00 crc kubenswrapper[4685]: I0321 04:12:00.160727 4685 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567772-jtj9v" Mar 21 04:12:00 crc kubenswrapper[4685]: I0321 04:12:00.163977 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k75cc" Mar 21 04:12:00 crc kubenswrapper[4685]: I0321 04:12:00.164183 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 04:12:00 crc kubenswrapper[4685]: I0321 04:12:00.164352 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 04:12:00 crc kubenswrapper[4685]: I0321 04:12:00.175393 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567772-jtj9v"] Mar 21 04:12:00 crc kubenswrapper[4685]: I0321 04:12:00.276263 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hmrg\" (UniqueName: \"kubernetes.io/projected/d91ef770-3ed7-4de5-9707-53b8bcad00d0-kube-api-access-7hmrg\") pod \"auto-csr-approver-29567772-jtj9v\" (UID: \"d91ef770-3ed7-4de5-9707-53b8bcad00d0\") " pod="openshift-infra/auto-csr-approver-29567772-jtj9v" Mar 21 04:12:00 crc kubenswrapper[4685]: I0321 04:12:00.378036 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hmrg\" (UniqueName: \"kubernetes.io/projected/d91ef770-3ed7-4de5-9707-53b8bcad00d0-kube-api-access-7hmrg\") pod \"auto-csr-approver-29567772-jtj9v\" (UID: \"d91ef770-3ed7-4de5-9707-53b8bcad00d0\") " pod="openshift-infra/auto-csr-approver-29567772-jtj9v" Mar 21 04:12:00 crc kubenswrapper[4685]: I0321 04:12:00.409339 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hmrg\" (UniqueName: \"kubernetes.io/projected/d91ef770-3ed7-4de5-9707-53b8bcad00d0-kube-api-access-7hmrg\") pod \"auto-csr-approver-29567772-jtj9v\" (UID: \"d91ef770-3ed7-4de5-9707-53b8bcad00d0\") " pod="openshift-infra/auto-csr-approver-29567772-jtj9v" Mar 21 04:12:00 crc kubenswrapper[4685]: I0321 04:12:00.510274 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567772-jtj9v" Mar 21 04:12:00 crc kubenswrapper[4685]: I0321 04:12:00.913316 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567772-jtj9v"] Mar 21 04:12:00 crc kubenswrapper[4685]: W0321 04:12:00.920346 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd91ef770_3ed7_4de5_9707_53b8bcad00d0.slice/crio-b1b9acb506f513d2f82bcf773ce9c50bed2ea8311fa5020bc43a6329fd62ff53 WatchSource:0}: Error finding container b1b9acb506f513d2f82bcf773ce9c50bed2ea8311fa5020bc43a6329fd62ff53: Status 404 returned error can't find the container with id b1b9acb506f513d2f82bcf773ce9c50bed2ea8311fa5020bc43a6329fd62ff53 Mar 21 04:12:01 crc kubenswrapper[4685]: I0321 04:12:01.870012 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567772-jtj9v" event={"ID":"d91ef770-3ed7-4de5-9707-53b8bcad00d0","Type":"ContainerStarted","Data":"b1b9acb506f513d2f82bcf773ce9c50bed2ea8311fa5020bc43a6329fd62ff53"} Mar 21 04:12:02 crc kubenswrapper[4685]: I0321 04:12:02.877502 4685 generic.go:334] "Generic (PLEG): container finished" podID="d91ef770-3ed7-4de5-9707-53b8bcad00d0" containerID="f59d28898c6e04272bb543ccb44c234b72d4d985db9747d660c85eabf5850b17" exitCode=0 Mar 21 04:12:02 crc kubenswrapper[4685]: I0321 04:12:02.877594 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567772-jtj9v" event={"ID":"d91ef770-3ed7-4de5-9707-53b8bcad00d0","Type":"ContainerDied","Data":"f59d28898c6e04272bb543ccb44c234b72d4d985db9747d660c85eabf5850b17"} Mar 21 04:12:04 crc kubenswrapper[4685]: I0321 04:12:04.188941 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567772-jtj9v" Mar 21 04:12:04 crc kubenswrapper[4685]: I0321 04:12:04.331565 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hmrg\" (UniqueName: \"kubernetes.io/projected/d91ef770-3ed7-4de5-9707-53b8bcad00d0-kube-api-access-7hmrg\") pod \"d91ef770-3ed7-4de5-9707-53b8bcad00d0\" (UID: \"d91ef770-3ed7-4de5-9707-53b8bcad00d0\") " Mar 21 04:12:04 crc kubenswrapper[4685]: I0321 04:12:04.336623 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d91ef770-3ed7-4de5-9707-53b8bcad00d0-kube-api-access-7hmrg" (OuterVolumeSpecName: "kube-api-access-7hmrg") pod "d91ef770-3ed7-4de5-9707-53b8bcad00d0" (UID: "d91ef770-3ed7-4de5-9707-53b8bcad00d0"). InnerVolumeSpecName "kube-api-access-7hmrg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:12:04 crc kubenswrapper[4685]: I0321 04:12:04.432861 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hmrg\" (UniqueName: \"kubernetes.io/projected/d91ef770-3ed7-4de5-9707-53b8bcad00d0-kube-api-access-7hmrg\") on node \"crc\" DevicePath \"\"" Mar 21 04:12:04 crc kubenswrapper[4685]: I0321 04:12:04.903028 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567772-jtj9v" event={"ID":"d91ef770-3ed7-4de5-9707-53b8bcad00d0","Type":"ContainerDied","Data":"b1b9acb506f513d2f82bcf773ce9c50bed2ea8311fa5020bc43a6329fd62ff53"} Mar 21 04:12:04 crc kubenswrapper[4685]: I0321 04:12:04.903094 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1b9acb506f513d2f82bcf773ce9c50bed2ea8311fa5020bc43a6329fd62ff53" Mar 21 04:12:04 crc kubenswrapper[4685]: I0321 04:12:04.903200 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567772-jtj9v" Mar 21 04:12:05 crc kubenswrapper[4685]: I0321 04:12:05.255089 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567766-lgcgr"] Mar 21 04:12:05 crc kubenswrapper[4685]: I0321 04:12:05.262026 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567766-lgcgr"] Mar 21 04:12:06 crc kubenswrapper[4685]: I0321 04:12:06.315660 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bbb4a07-790d-41ce-a9d0-40dd729bba6e" path="/var/lib/kubelet/pods/7bbb4a07-790d-41ce-a9d0-40dd729bba6e/volumes" Mar 21 04:12:09 crc kubenswrapper[4685]: I0321 04:12:09.685729 4685 patch_prober.go:28] interesting pod/machine-config-daemon-7r9cg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:12:09 crc kubenswrapper[4685]: I0321 04:12:09.686171 4685 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:12:39 crc kubenswrapper[4685]: I0321 04:12:39.595610 4685 scope.go:117] "RemoveContainer" containerID="3635458b4890450e0c9674ed9954b74779c9f2f2247462d1d69aa7931c450202" Mar 21 04:12:39 crc kubenswrapper[4685]: I0321 04:12:39.619264 4685 scope.go:117] "RemoveContainer" containerID="5e9acff5a2e10e21a9d5026b7bb3068bf20c1f1220a53a874327c1e90be60013" Mar 21 04:12:39 crc kubenswrapper[4685]: I0321 04:12:39.666599 4685 scope.go:117] "RemoveContainer" containerID="ca6361bdc8f807ade02a12c7a69bc2039fd0386bea3e6cc678936a1e5479fdf0" Mar 21 04:12:39 crc kubenswrapper[4685]: I0321 04:12:39.684750 4685 scope.go:117] "RemoveContainer" containerID="535ba5465e6117f6b479f79db72bec5a9b06c354ca9ed94038324a9bcc9b8a4e" Mar 21 04:12:39 crc kubenswrapper[4685]: I0321 04:12:39.685499 4685 patch_prober.go:28] interesting pod/machine-config-daemon-7r9cg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:12:39 crc kubenswrapper[4685]: I0321 
04:12:39.685573 4685 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:12:39 crc kubenswrapper[4685]: I0321 04:12:39.704172 4685 scope.go:117] "RemoveContainer" containerID="bd97aea7a640f3a4b7ab19871ef72a957097598c0d7085456eef047123523924" Mar 21 04:12:39 crc kubenswrapper[4685]: I0321 04:12:39.725122 4685 scope.go:117] "RemoveContainer" containerID="a119c9da6267126198e8b7fd6119c4f5f090b32d83a339841698fe6376d4c2ac" Mar 21 04:12:39 crc kubenswrapper[4685]: I0321 04:12:39.738282 4685 scope.go:117] "RemoveContainer" containerID="0492d6fb42b0838ad25d70798ee623971214615668da21d10d7464027769c5a0" Mar 21 04:12:39 crc kubenswrapper[4685]: I0321 04:12:39.759343 4685 scope.go:117] "RemoveContainer" containerID="44e1b45e9ba3a97b24d7c15aabb381845fd30deefbd2c562aa820ba014672ccb" Mar 21 04:12:39 crc kubenswrapper[4685]: I0321 04:12:39.781096 4685 scope.go:117] "RemoveContainer" containerID="702916a75fb456f3b64f4dcfb2eb966ae9ae6617d4c1e515a270fff3fd4b6224" Mar 21 04:12:39 crc kubenswrapper[4685]: I0321 04:12:39.796749 4685 scope.go:117] "RemoveContainer" containerID="5a5861306173fdcbba6d612a677946fced4cd4857b9630c2deab3b98569c2b37" Mar 21 04:12:39 crc kubenswrapper[4685]: I0321 04:12:39.816509 4685 scope.go:117] "RemoveContainer" containerID="8101f771393f4477ed3b0386624ce42ba8be023cfc7d057e7b79e3f15820014f" Mar 21 04:13:09 crc kubenswrapper[4685]: I0321 04:13:09.685168 4685 patch_prober.go:28] interesting pod/machine-config-daemon-7r9cg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:13:09 crc kubenswrapper[4685]: I0321 04:13:09.685724 4685 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:13:09 crc kubenswrapper[4685]: I0321 04:13:09.685769 4685 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" Mar 21 04:13:09 crc kubenswrapper[4685]: I0321 04:13:09.686335 4685 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d777c9ea0ed74e12c330732b76bb70566fa8eb4260f58ef17bc4882bafb2cbc6"} pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 04:13:09 crc kubenswrapper[4685]: I0321 04:13:09.686380 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerName="machine-config-daemon" containerID="cri-o://d777c9ea0ed74e12c330732b76bb70566fa8eb4260f58ef17bc4882bafb2cbc6" gracePeriod=600 Mar 21 04:13:09 crc kubenswrapper[4685]: E0321 04:13:09.807471 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7r9cg_openshift-machine-config-operator(cea46fe2-4e41-43ab-a069-cb30fb4e732c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" Mar 21 04:13:10 crc kubenswrapper[4685]: I0321 04:13:10.309460 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pvlwj/must-gather-rv9gg"] Mar 21 04:13:10 crc kubenswrapper[4685]: E0321 04:13:10.309684 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d91ef770-3ed7-4de5-9707-53b8bcad00d0" containerName="oc" Mar 21 04:13:10 crc kubenswrapper[4685]: I0321 04:13:10.309699 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="d91ef770-3ed7-4de5-9707-53b8bcad00d0" containerName="oc" Mar 21 04:13:10 crc kubenswrapper[4685]: I0321 04:13:10.309862 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="d91ef770-3ed7-4de5-9707-53b8bcad00d0" containerName="oc" Mar 21 04:13:10 crc kubenswrapper[4685]: I0321 04:13:10.310522 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pvlwj/must-gather-rv9gg"] Mar 21 04:13:10 crc kubenswrapper[4685]: I0321 04:13:10.310618 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pvlwj/must-gather-rv9gg" Mar 21 04:13:10 crc kubenswrapper[4685]: I0321 04:13:10.316028 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-pvlwj"/"openshift-service-ca.crt" Mar 21 04:13:10 crc kubenswrapper[4685]: I0321 04:13:10.316680 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-pvlwj"/"default-dockercfg-9lqbm" Mar 21 04:13:10 crc kubenswrapper[4685]: I0321 04:13:10.331220 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-pvlwj"/"kube-root-ca.crt" Mar 21 04:13:10 crc kubenswrapper[4685]: I0321 04:13:10.348053 4685 generic.go:334] "Generic (PLEG): container finished" podID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerID="d777c9ea0ed74e12c330732b76bb70566fa8eb4260f58ef17bc4882bafb2cbc6" exitCode=0 Mar 21 04:13:10 crc kubenswrapper[4685]: I0321 04:13:10.348095 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" event={"ID":"cea46fe2-4e41-43ab-a069-cb30fb4e732c","Type":"ContainerDied","Data":"d777c9ea0ed74e12c330732b76bb70566fa8eb4260f58ef17bc4882bafb2cbc6"} Mar 21 04:13:10 crc kubenswrapper[4685]: I0321 04:13:10.348144 4685 scope.go:117] "RemoveContainer" containerID="855883c827cd8a38d55f90b9086e8832f325f74b077f5a71a8a2d2ad0a467f7f" Mar 21 04:13:10 crc kubenswrapper[4685]: I0321 04:13:10.348575 4685 scope.go:117] "RemoveContainer" containerID="d777c9ea0ed74e12c330732b76bb70566fa8eb4260f58ef17bc4882bafb2cbc6" Mar 21 04:13:10 crc kubenswrapper[4685]: E0321 04:13:10.348807 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7r9cg_openshift-machine-config-operator(cea46fe2-4e41-43ab-a069-cb30fb4e732c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" Mar 21 04:13:10 crc kubenswrapper[4685]: I0321 04:13:10.418090 4685 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hv8z\" (UniqueName: \"kubernetes.io/projected/26eea1ac-1be6-405e-a606-dadf213577a2-kube-api-access-5hv8z\") pod \"must-gather-rv9gg\" (UID: \"26eea1ac-1be6-405e-a606-dadf213577a2\") " pod="openshift-must-gather-pvlwj/must-gather-rv9gg" Mar 21 04:13:10 crc kubenswrapper[4685]: I0321 04:13:10.418515 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/26eea1ac-1be6-405e-a606-dadf213577a2-must-gather-output\") pod \"must-gather-rv9gg\" (UID: \"26eea1ac-1be6-405e-a606-dadf213577a2\") " pod="openshift-must-gather-pvlwj/must-gather-rv9gg" Mar 21 04:13:10 crc kubenswrapper[4685]: I0321 04:13:10.519474 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hv8z\" (UniqueName: \"kubernetes.io/projected/26eea1ac-1be6-405e-a606-dadf213577a2-kube-api-access-5hv8z\") pod \"must-gather-rv9gg\" (UID: \"26eea1ac-1be6-405e-a606-dadf213577a2\") " pod="openshift-must-gather-pvlwj/must-gather-rv9gg" Mar 21 04:13:10 crc kubenswrapper[4685]: I0321 04:13:10.519555 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/26eea1ac-1be6-405e-a606-dadf213577a2-must-gather-output\") pod \"must-gather-rv9gg\" (UID: \"26eea1ac-1be6-405e-a606-dadf213577a2\") " pod="openshift-must-gather-pvlwj/must-gather-rv9gg" Mar 21 04:13:10 crc kubenswrapper[4685]: I0321 04:13:10.519988 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/26eea1ac-1be6-405e-a606-dadf213577a2-must-gather-output\") pod \"must-gather-rv9gg\" (UID: \"26eea1ac-1be6-405e-a606-dadf213577a2\") " pod="openshift-must-gather-pvlwj/must-gather-rv9gg" Mar 21 04:13:10 crc kubenswrapper[4685]: I0321 04:13:10.557582 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hv8z\" (UniqueName: \"kubernetes.io/projected/26eea1ac-1be6-405e-a606-dadf213577a2-kube-api-access-5hv8z\") pod \"must-gather-rv9gg\" (UID: \"26eea1ac-1be6-405e-a606-dadf213577a2\") " pod="openshift-must-gather-pvlwj/must-gather-rv9gg" Mar 21 04:13:10 crc kubenswrapper[4685]: I0321 04:13:10.628786 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pvlwj/must-gather-rv9gg" Mar 21 04:13:11 crc kubenswrapper[4685]: I0321 04:13:11.090756 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pvlwj/must-gather-rv9gg"] Mar 21 04:13:11 crc kubenswrapper[4685]: I0321 04:13:11.356224 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pvlwj/must-gather-rv9gg" event={"ID":"26eea1ac-1be6-405e-a606-dadf213577a2","Type":"ContainerStarted","Data":"c55bc7fca4f251b216c79e669b7e9caad787358e25c805dd7819ae55069f924c"} Mar 21 04:13:11 crc kubenswrapper[4685]: I0321 04:13:11.356260 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pvlwj/must-gather-rv9gg" event={"ID":"26eea1ac-1be6-405e-a606-dadf213577a2","Type":"ContainerStarted","Data":"1b871323ae16c414e859664a72538092d19ded1d836a670d25c2ecd46485aa51"} Mar 21 04:13:12 crc kubenswrapper[4685]: I0321 04:13:12.370893 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pvlwj/must-gather-rv9gg" event={"ID":"26eea1ac-1be6-405e-a606-dadf213577a2","Type":"ContainerStarted","Data":"5716e822b3b1d2785bdd39e067f9486705adfdd1b2955898e487b8f5c1bcb811"} Mar 21 04:13:12 crc kubenswrapper[4685]: I0321 04:13:12.393665 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pvlwj/must-gather-rv9gg" podStartSLOduration=2.393642884 podStartE2EDuration="2.393642884s" podCreationTimestamp="2026-03-21 04:13:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:13:12.38786334 +0000 UTC m=+1624.864932172" watchObservedRunningTime="2026-03-21 04:13:12.393642884 +0000 UTC m=+1624.870711676" Mar 21 04:13:15 crc kubenswrapper[4685]: I0321 04:13:15.114340 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ppqsw"] Mar 21 04:13:15 crc kubenswrapper[4685]: I0321 04:13:15.115673 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ppqsw" Mar 21 04:13:15 crc kubenswrapper[4685]: I0321 04:13:15.131864 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ppqsw"] Mar 21 04:13:15 crc kubenswrapper[4685]: I0321 04:13:15.175526 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de8a2604-c82d-4fef-9272-ec52fa958b3a-catalog-content\") pod \"redhat-operators-ppqsw\" (UID: \"de8a2604-c82d-4fef-9272-ec52fa958b3a\") " pod="openshift-marketplace/redhat-operators-ppqsw" Mar 21 04:13:15 crc kubenswrapper[4685]: I0321 04:13:15.175586 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de8a2604-c82d-4fef-9272-ec52fa958b3a-utilities\") pod \"redhat-operators-ppqsw\" (UID: \"de8a2604-c82d-4fef-9272-ec52fa958b3a\") " pod="openshift-marketplace/redhat-operators-ppqsw" Mar 21 04:13:15 crc kubenswrapper[4685]: I0321 04:13:15.175665 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds25g\" (UniqueName: \"kubernetes.io/projected/de8a2604-c82d-4fef-9272-ec52fa958b3a-kube-api-access-ds25g\") pod \"redhat-operators-ppqsw\" (UID: \"de8a2604-c82d-4fef-9272-ec52fa958b3a\") " pod="openshift-marketplace/redhat-operators-ppqsw" Mar 21 04:13:15 crc kubenswrapper[4685]: I0321 04:13:15.276571 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ds25g\" (UniqueName: \"kubernetes.io/projected/de8a2604-c82d-4fef-9272-ec52fa958b3a-kube-api-access-ds25g\") pod \"redhat-operators-ppqsw\" (UID: \"de8a2604-c82d-4fef-9272-ec52fa958b3a\") " pod="openshift-marketplace/redhat-operators-ppqsw" Mar 21 04:13:15 crc kubenswrapper[4685]: I0321 04:13:15.276647 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de8a2604-c82d-4fef-9272-ec52fa958b3a-catalog-content\") pod \"redhat-operators-ppqsw\" (UID: \"de8a2604-c82d-4fef-9272-ec52fa958b3a\") " pod="openshift-marketplace/redhat-operators-ppqsw" Mar 21 04:13:15 crc kubenswrapper[4685]: I0321 04:13:15.276684 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de8a2604-c82d-4fef-9272-ec52fa958b3a-utilities\") pod \"redhat-operators-ppqsw\" (UID: \"de8a2604-c82d-4fef-9272-ec52fa958b3a\") " pod="openshift-marketplace/redhat-operators-ppqsw" Mar 21 04:13:15 crc kubenswrapper[4685]: I0321 04:13:15.277324 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de8a2604-c82d-4fef-9272-ec52fa958b3a-catalog-content\") pod \"redhat-operators-ppqsw\" (UID: \"de8a2604-c82d-4fef-9272-ec52fa958b3a\") " pod="openshift-marketplace/redhat-operators-ppqsw" Mar 21 04:13:15 crc kubenswrapper[4685]: I0321 04:13:15.277398 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de8a2604-c82d-4fef-9272-ec52fa958b3a-utilities\") pod \"redhat-operators-ppqsw\" (UID: \"de8a2604-c82d-4fef-9272-ec52fa958b3a\") " pod="openshift-marketplace/redhat-operators-ppqsw" Mar 21 04:13:15 crc kubenswrapper[4685]: I0321 04:13:15.296501 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ds25g\" (UniqueName: \"kubernetes.io/projected/de8a2604-c82d-4fef-9272-ec52fa958b3a-kube-api-access-ds25g\") pod \"redhat-operators-ppqsw\" (UID: \"de8a2604-c82d-4fef-9272-ec52fa958b3a\") " pod="openshift-marketplace/redhat-operators-ppqsw" Mar 21 04:13:15 crc kubenswrapper[4685]: I0321 04:13:15.433455 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ppqsw" Mar 21 04:13:15 crc kubenswrapper[4685]: W0321 04:13:15.762519 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde8a2604_c82d_4fef_9272_ec52fa958b3a.slice/crio-8f15cd78170f01ba0c8594304dd0fcbc733af3c8a4fbfb6bf79f66e40c330861 WatchSource:0}: Error finding container 8f15cd78170f01ba0c8594304dd0fcbc733af3c8a4fbfb6bf79f66e40c330861: Status 404 returned error can't find the container with id 8f15cd78170f01ba0c8594304dd0fcbc733af3c8a4fbfb6bf79f66e40c330861 Mar 21 04:13:15 crc kubenswrapper[4685]: I0321 04:13:15.770310 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ppqsw"] Mar 21 04:13:16 crc kubenswrapper[4685]: I0321 04:13:16.394454 4685 generic.go:334] "Generic (PLEG): container finished" podID="de8a2604-c82d-4fef-9272-ec52fa958b3a" containerID="175a58c2f42b929c21c4dd0d941f90415dd9b9e867c30e156325e7fa436ce790" exitCode=0 Mar 21 04:13:16 crc kubenswrapper[4685]: I0321 04:13:16.394530 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppqsw" event={"ID":"de8a2604-c82d-4fef-9272-ec52fa958b3a","Type":"ContainerDied","Data":"175a58c2f42b929c21c4dd0d941f90415dd9b9e867c30e156325e7fa436ce790"} Mar 21 04:13:16 crc kubenswrapper[4685]: I0321 04:13:16.394726 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppqsw" event={"ID":"de8a2604-c82d-4fef-9272-ec52fa958b3a","Type":"ContainerStarted","Data":"8f15cd78170f01ba0c8594304dd0fcbc733af3c8a4fbfb6bf79f66e40c330861"} Mar 21 04:13:16 crc kubenswrapper[4685]: I0321 04:13:16.396397 4685 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 04:13:18 crc kubenswrapper[4685]: I0321 04:13:18.409711 4685 generic.go:334] "Generic (PLEG): container finished" podID="de8a2604-c82d-4fef-9272-ec52fa958b3a" containerID="4363fcd5754dc0d4adf0a77c1adfa9bcd7bb7007fb04429d8e548f31e491ba33" exitCode=0 Mar 21 04:13:18 crc kubenswrapper[4685]: I0321 04:13:18.409786 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppqsw" event={"ID":"de8a2604-c82d-4fef-9272-ec52fa958b3a","Type":"ContainerDied","Data":"4363fcd5754dc0d4adf0a77c1adfa9bcd7bb7007fb04429d8e548f31e491ba33"} Mar 21 04:13:19 crc kubenswrapper[4685]: I0321 04:13:19.418910 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppqsw" event={"ID":"de8a2604-c82d-4fef-9272-ec52fa958b3a","Type":"ContainerStarted","Data":"eb3ef2c007c010fa7ea4351e463bd5821322d0cc344e1c7b4a58b517300b42aa"} Mar 21 04:13:19 crc kubenswrapper[4685]: I0321 04:13:19.452666 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ppqsw" podStartSLOduration=1.685314789 podStartE2EDuration="4.452649999s" podCreationTimestamp="2026-03-21 04:13:15 +0000 UTC" firstStartedPulling="2026-03-21 04:13:16.396107935 +0000 UTC m=+1628.873176727" lastFinishedPulling="2026-03-21 04:13:19.163443145 
+0000 UTC m=+1631.640511937" observedRunningTime="2026-03-21 04:13:19.449263492 +0000 UTC m=+1631.926332304" watchObservedRunningTime="2026-03-21 04:13:19.452649999 +0000 UTC m=+1631.929718791" Mar 21 04:13:24 crc kubenswrapper[4685]: I0321 04:13:24.300841 4685 scope.go:117] "RemoveContainer" containerID="d777c9ea0ed74e12c330732b76bb70566fa8eb4260f58ef17bc4882bafb2cbc6" Mar 21 04:13:24 crc kubenswrapper[4685]: E0321 04:13:24.301660 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7r9cg_openshift-machine-config-operator(cea46fe2-4e41-43ab-a069-cb30fb4e732c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" Mar 21 04:13:25 crc kubenswrapper[4685]: I0321 04:13:25.433598 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ppqsw" Mar 21 04:13:25 crc kubenswrapper[4685]: I0321 04:13:25.433675 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ppqsw" Mar 21 04:13:25 crc kubenswrapper[4685]: I0321 04:13:25.491298 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ppqsw" Mar 21 04:13:25 crc kubenswrapper[4685]: I0321 04:13:25.531637 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ppqsw" Mar 21 04:13:25 crc kubenswrapper[4685]: I0321 04:13:25.773652 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ppqsw"] Mar 21 04:13:27 crc kubenswrapper[4685]: I0321 04:13:27.459746 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ppqsw" podUID="de8a2604-c82d-4fef-9272-ec52fa958b3a" containerName="registry-server" containerID="cri-o://eb3ef2c007c010fa7ea4351e463bd5821322d0cc344e1c7b4a58b517300b42aa" gracePeriod=2 Mar 21 04:13:27 crc kubenswrapper[4685]: I0321 04:13:27.821753 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ppqsw" Mar 21 04:13:27 crc kubenswrapper[4685]: I0321 04:13:27.940463 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de8a2604-c82d-4fef-9272-ec52fa958b3a-utilities\") pod \"de8a2604-c82d-4fef-9272-ec52fa958b3a\" (UID: \"de8a2604-c82d-4fef-9272-ec52fa958b3a\") " Mar 21 04:13:27 crc kubenswrapper[4685]: I0321 04:13:27.940853 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de8a2604-c82d-4fef-9272-ec52fa958b3a-catalog-content\") pod \"de8a2604-c82d-4fef-9272-ec52fa958b3a\" (UID: \"de8a2604-c82d-4fef-9272-ec52fa958b3a\") " Mar 21 04:13:27 crc kubenswrapper[4685]: I0321 04:13:27.940921 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ds25g\" (UniqueName: \"kubernetes.io/projected/de8a2604-c82d-4fef-9272-ec52fa958b3a-kube-api-access-ds25g\") pod \"de8a2604-c82d-4fef-9272-ec52fa958b3a\" (UID: \"de8a2604-c82d-4fef-9272-ec52fa958b3a\") " Mar 21 04:13:27 crc kubenswrapper[4685]: I0321 04:13:27.941883 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de8a2604-c82d-4fef-9272-ec52fa958b3a-utilities" (OuterVolumeSpecName: "utilities") pod "de8a2604-c82d-4fef-9272-ec52fa958b3a" (UID: "de8a2604-c82d-4fef-9272-ec52fa958b3a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:13:27 crc kubenswrapper[4685]: I0321 04:13:27.956263 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de8a2604-c82d-4fef-9272-ec52fa958b3a-kube-api-access-ds25g" (OuterVolumeSpecName: "kube-api-access-ds25g") pod "de8a2604-c82d-4fef-9272-ec52fa958b3a" (UID: "de8a2604-c82d-4fef-9272-ec52fa958b3a"). InnerVolumeSpecName "kube-api-access-ds25g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:13:28 crc kubenswrapper[4685]: I0321 04:13:28.042407 4685 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de8a2604-c82d-4fef-9272-ec52fa958b3a-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:13:28 crc kubenswrapper[4685]: I0321 04:13:28.042439 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ds25g\" (UniqueName: \"kubernetes.io/projected/de8a2604-c82d-4fef-9272-ec52fa958b3a-kube-api-access-ds25g\") on node \"crc\" DevicePath \"\"" Mar 21 04:13:28 crc kubenswrapper[4685]: I0321 04:13:28.467096 4685 generic.go:334] "Generic (PLEG): container finished" podID="de8a2604-c82d-4fef-9272-ec52fa958b3a" containerID="eb3ef2c007c010fa7ea4351e463bd5821322d0cc344e1c7b4a58b517300b42aa" exitCode=0 Mar 21 04:13:28 crc kubenswrapper[4685]: I0321 04:13:28.467143 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppqsw" event={"ID":"de8a2604-c82d-4fef-9272-ec52fa958b3a","Type":"ContainerDied","Data":"eb3ef2c007c010fa7ea4351e463bd5821322d0cc344e1c7b4a58b517300b42aa"} Mar 21 04:13:28 crc kubenswrapper[4685]: I0321 04:13:28.467171 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppqsw" event={"ID":"de8a2604-c82d-4fef-9272-ec52fa958b3a","Type":"ContainerDied","Data":"8f15cd78170f01ba0c8594304dd0fcbc733af3c8a4fbfb6bf79f66e40c330861"} Mar 21 04:13:28 crc kubenswrapper[4685]: I0321 04:13:28.467193 4685 scope.go:117] "RemoveContainer" containerID="eb3ef2c007c010fa7ea4351e463bd5821322d0cc344e1c7b4a58b517300b42aa" Mar 21 04:13:28 crc kubenswrapper[4685]: I0321 04:13:28.467193 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ppqsw" Mar 21 04:13:28 crc kubenswrapper[4685]: I0321 04:13:28.487395 4685 scope.go:117] "RemoveContainer" containerID="4363fcd5754dc0d4adf0a77c1adfa9bcd7bb7007fb04429d8e548f31e491ba33" Mar 21 04:13:28 crc kubenswrapper[4685]: I0321 04:13:28.506401 4685 scope.go:117] "RemoveContainer" containerID="175a58c2f42b929c21c4dd0d941f90415dd9b9e867c30e156325e7fa436ce790" Mar 21 04:13:28 crc kubenswrapper[4685]: I0321 04:13:28.521034 4685 scope.go:117] "RemoveContainer" containerID="eb3ef2c007c010fa7ea4351e463bd5821322d0cc344e1c7b4a58b517300b42aa" Mar 21 04:13:28 crc kubenswrapper[4685]: E0321 04:13:28.521625 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb3ef2c007c010fa7ea4351e463bd5821322d0cc344e1c7b4a58b517300b42aa\": container with ID starting with eb3ef2c007c010fa7ea4351e463bd5821322d0cc344e1c7b4a58b517300b42aa not found: ID does not exist" containerID="eb3ef2c007c010fa7ea4351e463bd5821322d0cc344e1c7b4a58b517300b42aa" Mar 21 04:13:28 crc kubenswrapper[4685]: I0321 04:13:28.521667 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb3ef2c007c010fa7ea4351e463bd5821322d0cc344e1c7b4a58b517300b42aa"} err="failed to get container status \"eb3ef2c007c010fa7ea4351e463bd5821322d0cc344e1c7b4a58b517300b42aa\": rpc error: code = NotFound desc = could not find container \"eb3ef2c007c010fa7ea4351e463bd5821322d0cc344e1c7b4a58b517300b42aa\": container with ID starting with eb3ef2c007c010fa7ea4351e463bd5821322d0cc344e1c7b4a58b517300b42aa not found: ID does not exist" Mar 21 04:13:28 crc kubenswrapper[4685]: I0321 04:13:28.521693 4685 scope.go:117] "RemoveContainer" containerID="4363fcd5754dc0d4adf0a77c1adfa9bcd7bb7007fb04429d8e548f31e491ba33" Mar 21 04:13:28 crc kubenswrapper[4685]: E0321 04:13:28.522216 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4363fcd5754dc0d4adf0a77c1adfa9bcd7bb7007fb04429d8e548f31e491ba33\": container with ID starting with 4363fcd5754dc0d4adf0a77c1adfa9bcd7bb7007fb04429d8e548f31e491ba33 not found: ID does not exist" containerID="4363fcd5754dc0d4adf0a77c1adfa9bcd7bb7007fb04429d8e548f31e491ba33" Mar 21 04:13:28 crc kubenswrapper[4685]: I0321 04:13:28.522252 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4363fcd5754dc0d4adf0a77c1adfa9bcd7bb7007fb04429d8e548f31e491ba33"} err="failed to get container status \"4363fcd5754dc0d4adf0a77c1adfa9bcd7bb7007fb04429d8e548f31e491ba33\": rpc error: code = NotFound desc = could not find container \"4363fcd5754dc0d4adf0a77c1adfa9bcd7bb7007fb04429d8e548f31e491ba33\": container with ID starting with 4363fcd5754dc0d4adf0a77c1adfa9bcd7bb7007fb04429d8e548f31e491ba33 not found: ID does not exist" Mar 21 04:13:28 crc kubenswrapper[4685]: I0321 04:13:28.522278 4685 scope.go:117] "RemoveContainer" containerID="175a58c2f42b929c21c4dd0d941f90415dd9b9e867c30e156325e7fa436ce790" Mar 21 04:13:28 crc kubenswrapper[4685]: E0321 04:13:28.522610 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"175a58c2f42b929c21c4dd0d941f90415dd9b9e867c30e156325e7fa436ce790\": container with ID starting with 175a58c2f42b929c21c4dd0d941f90415dd9b9e867c30e156325e7fa436ce790 not found: ID does not exist" containerID="175a58c2f42b929c21c4dd0d941f90415dd9b9e867c30e156325e7fa436ce790" 
Mar 21 04:13:28 crc kubenswrapper[4685]: I0321 04:13:28.522637 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"175a58c2f42b929c21c4dd0d941f90415dd9b9e867c30e156325e7fa436ce790"} err="failed to get container status \"175a58c2f42b929c21c4dd0d941f90415dd9b9e867c30e156325e7fa436ce790\": rpc error: code = NotFound desc = could not find container \"175a58c2f42b929c21c4dd0d941f90415dd9b9e867c30e156325e7fa436ce790\": container with ID starting with 175a58c2f42b929c21c4dd0d941f90415dd9b9e867c30e156325e7fa436ce790 not found: ID does not exist" Mar 21 04:13:30 crc kubenswrapper[4685]: I0321 04:13:30.222048 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de8a2604-c82d-4fef-9272-ec52fa958b3a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de8a2604-c82d-4fef-9272-ec52fa958b3a" (UID: "de8a2604-c82d-4fef-9272-ec52fa958b3a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:13:30 crc kubenswrapper[4685]: I0321 04:13:30.274683 4685 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de8a2604-c82d-4fef-9272-ec52fa958b3a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:13:30 crc kubenswrapper[4685]: I0321 04:13:30.294495 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ppqsw"] Mar 21 04:13:30 crc kubenswrapper[4685]: I0321 04:13:30.300126 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ppqsw"] Mar 21 04:13:30 crc kubenswrapper[4685]: I0321 04:13:30.318778 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de8a2604-c82d-4fef-9272-ec52fa958b3a" path="/var/lib/kubelet/pods/de8a2604-c82d-4fef-9272-ec52fa958b3a/volumes" Mar 21 04:13:35 crc kubenswrapper[4685]: I0321 04:13:35.301595 4685 scope.go:117] "RemoveContainer" containerID="d777c9ea0ed74e12c330732b76bb70566fa8eb4260f58ef17bc4882bafb2cbc6" Mar 21 04:13:35 crc kubenswrapper[4685]: E0321 04:13:35.302234 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7r9cg_openshift-machine-config-operator(cea46fe2-4e41-43ab-a069-cb30fb4e732c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" Mar 21 04:13:46 crc kubenswrapper[4685]: I0321 04:13:46.300985 4685 scope.go:117] "RemoveContainer" containerID="d777c9ea0ed74e12c330732b76bb70566fa8eb4260f58ef17bc4882bafb2cbc6" Mar 21 04:13:46 crc kubenswrapper[4685]: E0321 04:13:46.301635 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7r9cg_openshift-machine-config-operator(cea46fe2-4e41-43ab-a069-cb30fb4e732c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" Mar 21 04:13:57 crc kubenswrapper[4685]: I0321 04:13:57.300881 4685 scope.go:117] "RemoveContainer" containerID="d777c9ea0ed74e12c330732b76bb70566fa8eb4260f58ef17bc4882bafb2cbc6" Mar 21 04:13:57 crc kubenswrapper[4685]: E0321 04:13:57.301521 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7r9cg_openshift-machine-config-operator(cea46fe2-4e41-43ab-a069-cb30fb4e732c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" Mar 21 04:13:59 crc kubenswrapper[4685]: I0321 04:13:59.328728 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-646sv_3f819915-64a0-4327-ac01-5ff842cbc592/control-plane-machine-set-operator/0.log" Mar 21 04:13:59 crc kubenswrapper[4685]: I0321 04:13:59.497468 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-flxks_1ee95c71-cb75-4357-aeff-c0417a0c6eb3/kube-rbac-proxy/0.log" Mar 21 04:13:59 crc kubenswrapper[4685]: I0321 04:13:59.519015 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-flxks_1ee95c71-cb75-4357-aeff-c0417a0c6eb3/machine-api-operator/0.log" Mar 21 04:14:00 crc kubenswrapper[4685]: I0321 04:14:00.136712 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567774-pnl9j"] Mar 21 04:14:00 crc kubenswrapper[4685]: E0321 04:14:00.136945 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de8a2604-c82d-4fef-9272-ec52fa958b3a" containerName="registry-server" Mar 21 04:14:00 crc kubenswrapper[4685]: I0321 04:14:00.136957 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="de8a2604-c82d-4fef-9272-ec52fa958b3a" containerName="registry-server" Mar 21 04:14:00 crc kubenswrapper[4685]: E0321 04:14:00.136968 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de8a2604-c82d-4fef-9272-ec52fa958b3a" containerName="extract-content" Mar 21 04:14:00 crc kubenswrapper[4685]: I0321 04:14:00.136975 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="de8a2604-c82d-4fef-9272-ec52fa958b3a" containerName="extract-content" Mar 21 04:14:00 crc kubenswrapper[4685]: E0321 04:14:00.136990 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de8a2604-c82d-4fef-9272-ec52fa958b3a" containerName="extract-utilities" Mar 21 04:14:00 crc kubenswrapper[4685]: I0321 04:14:00.136997 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="de8a2604-c82d-4fef-9272-ec52fa958b3a" containerName="extract-utilities" Mar 21 04:14:00 crc kubenswrapper[4685]: I0321 04:14:00.137094 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="de8a2604-c82d-4fef-9272-ec52fa958b3a" containerName="registry-server" Mar 21 04:14:00 crc kubenswrapper[4685]: I0321 04:14:00.137455 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567774-pnl9j" Mar 21 04:14:00 crc kubenswrapper[4685]: I0321 04:14:00.139197 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k75cc" Mar 21 04:14:00 crc kubenswrapper[4685]: I0321 04:14:00.144663 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 04:14:00 crc kubenswrapper[4685]: I0321 04:14:00.149030 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 04:14:00 crc kubenswrapper[4685]: I0321 04:14:00.152131 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567774-pnl9j"] Mar 21 04:14:00 crc kubenswrapper[4685]: I0321 04:14:00.241406 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbwjt\" (UniqueName: \"kubernetes.io/projected/7ba66306-04db-4c93-980e-680dd8410a44-kube-api-access-cbwjt\") pod \"auto-csr-approver-29567774-pnl9j\" (UID: \"7ba66306-04db-4c93-980e-680dd8410a44\") " pod="openshift-infra/auto-csr-approver-29567774-pnl9j" Mar 21 04:14:00 crc kubenswrapper[4685]: I0321 04:14:00.342416 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbwjt\" (UniqueName: \"kubernetes.io/projected/7ba66306-04db-4c93-980e-680dd8410a44-kube-api-access-cbwjt\") pod \"auto-csr-approver-29567774-pnl9j\" (UID: \"7ba66306-04db-4c93-980e-680dd8410a44\") " pod="openshift-infra/auto-csr-approver-29567774-pnl9j" Mar 21 04:14:00 crc kubenswrapper[4685]: I0321 04:14:00.368556 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbwjt\" (UniqueName: \"kubernetes.io/projected/7ba66306-04db-4c93-980e-680dd8410a44-kube-api-access-cbwjt\") pod \"auto-csr-approver-29567774-pnl9j\" (UID: \"7ba66306-04db-4c93-980e-680dd8410a44\") " pod="openshift-infra/auto-csr-approver-29567774-pnl9j" Mar 21 04:14:00 crc kubenswrapper[4685]: I0321 04:14:00.453298 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567774-pnl9j" Mar 21 04:14:00 crc kubenswrapper[4685]: I0321 04:14:00.861346 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567774-pnl9j"] Mar 21 04:14:01 crc kubenswrapper[4685]: I0321 04:14:01.661977 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567774-pnl9j" event={"ID":"7ba66306-04db-4c93-980e-680dd8410a44","Type":"ContainerStarted","Data":"d4cb0bc31e7b498cb532d3e022a2deaa24f872b48ae72d9d4796b26a718a706a"} Mar 21 04:14:02 crc kubenswrapper[4685]: I0321 04:14:02.670182 4685 generic.go:334] "Generic (PLEG): container finished" podID="7ba66306-04db-4c93-980e-680dd8410a44" containerID="700b498d68ebba4ca8448b18d640eb5cadd818dfea0fffe88ed8c8bcd3457240" exitCode=0 Mar 21 04:14:02 crc kubenswrapper[4685]: I0321 04:14:02.670283 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567774-pnl9j" event={"ID":"7ba66306-04db-4c93-980e-680dd8410a44","Type":"ContainerDied","Data":"700b498d68ebba4ca8448b18d640eb5cadd818dfea0fffe88ed8c8bcd3457240"} Mar 21 04:14:03 crc kubenswrapper[4685]: I0321 04:14:03.890332 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567774-pnl9j" Mar 21 04:14:03 crc kubenswrapper[4685]: I0321 04:14:03.996848 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbwjt\" (UniqueName: \"kubernetes.io/projected/7ba66306-04db-4c93-980e-680dd8410a44-kube-api-access-cbwjt\") pod \"7ba66306-04db-4c93-980e-680dd8410a44\" (UID: \"7ba66306-04db-4c93-980e-680dd8410a44\") " Mar 21 04:14:04 crc kubenswrapper[4685]: I0321 04:14:04.008774 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ba66306-04db-4c93-980e-680dd8410a44-kube-api-access-cbwjt" (OuterVolumeSpecName: "kube-api-access-cbwjt") pod "7ba66306-04db-4c93-980e-680dd8410a44" (UID: "7ba66306-04db-4c93-980e-680dd8410a44"). InnerVolumeSpecName "kube-api-access-cbwjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:14:04 crc kubenswrapper[4685]: I0321 04:14:04.098658 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbwjt\" (UniqueName: \"kubernetes.io/projected/7ba66306-04db-4c93-980e-680dd8410a44-kube-api-access-cbwjt\") on node \"crc\" DevicePath \"\"" Mar 21 04:14:04 crc kubenswrapper[4685]: I0321 04:14:04.681450 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567774-pnl9j" event={"ID":"7ba66306-04db-4c93-980e-680dd8410a44","Type":"ContainerDied","Data":"d4cb0bc31e7b498cb532d3e022a2deaa24f872b48ae72d9d4796b26a718a706a"} Mar 21 04:14:04 crc kubenswrapper[4685]: I0321 04:14:04.681484 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567774-pnl9j" Mar 21 04:14:04 crc kubenswrapper[4685]: I0321 04:14:04.681493 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4cb0bc31e7b498cb532d3e022a2deaa24f872b48ae72d9d4796b26a718a706a" Mar 21 04:14:04 crc kubenswrapper[4685]: I0321 04:14:04.949225 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567768-z5dsj"] Mar 21 04:14:04 crc kubenswrapper[4685]: I0321 04:14:04.953953 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567768-z5dsj"] Mar 21 04:14:06 crc kubenswrapper[4685]: I0321 04:14:06.306798 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaa4eac7-8489-46a3-a666-7794ef9be68d" path="/var/lib/kubelet/pods/eaa4eac7-8489-46a3-a666-7794ef9be68d/volumes" Mar 21 04:14:09 crc kubenswrapper[4685]: I0321 04:14:09.301015 4685 scope.go:117] "RemoveContainer" containerID="d777c9ea0ed74e12c330732b76bb70566fa8eb4260f58ef17bc4882bafb2cbc6" Mar 21 04:14:09 crc kubenswrapper[4685]: E0321 04:14:09.302068 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7r9cg_openshift-machine-config-operator(cea46fe2-4e41-43ab-a069-cb30fb4e732c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" Mar 21 04:14:21 crc kubenswrapper[4685]: I0321 04:14:21.300588 4685 scope.go:117] "RemoveContainer" containerID="d777c9ea0ed74e12c330732b76bb70566fa8eb4260f58ef17bc4882bafb2cbc6" Mar 21 04:14:21 crc kubenswrapper[4685]: E0321 04:14:21.301424 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7r9cg_openshift-machine-config-operator(cea46fe2-4e41-43ab-a069-cb30fb4e732c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" Mar 21 04:14:26 crc kubenswrapper[4685]: I0321 04:14:26.145272 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-w4n8h_76cca999-b151-46c4-b61b-b6249d75e2f5/kube-rbac-proxy/0.log" Mar 21 04:14:26 crc kubenswrapper[4685]: I0321 04:14:26.190507 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-w4n8h_76cca999-b151-46c4-b61b-b6249d75e2f5/controller/0.log" Mar 21 04:14:26 crc kubenswrapper[4685]: I0321 04:14:26.325228 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/cp-frr-files/0.log" Mar 21 04:14:26 crc kubenswrapper[4685]: I0321 04:14:26.523599 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/cp-metrics/0.log" Mar 21 04:14:26 crc kubenswrapper[4685]: I0321 04:14:26.534586 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/cp-frr-files/0.log" Mar 21 04:14:26 crc kubenswrapper[4685]: I0321 04:14:26.540596 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/cp-reloader/0.log" Mar 21 04:14:26 crc kubenswrapper[4685]: I0321 04:14:26.554315 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/cp-reloader/0.log" Mar 21 04:14:26 crc kubenswrapper[4685]: I0321 04:14:26.715341 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/cp-metrics/0.log" Mar 21 04:14:26 crc kubenswrapper[4685]: I0321 04:14:26.726240 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/cp-frr-files/0.log" Mar 21 04:14:26 crc kubenswrapper[4685]: I0321 04:14:26.739934 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/cp-reloader/0.log" Mar 21 04:14:26 crc kubenswrapper[4685]: I0321 04:14:26.758316 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/cp-metrics/0.log" Mar 21 04:14:26 crc kubenswrapper[4685]: I0321 04:14:26.913196 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/cp-reloader/0.log" Mar 21 04:14:26 crc kubenswrapper[4685]: I0321 04:14:26.924037 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/cp-metrics/0.log" Mar 21 04:14:26 crc kubenswrapper[4685]: I0321 04:14:26.927036 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/controller/0.log" Mar 21 04:14:26 crc kubenswrapper[4685]: I0321 04:14:26.950370 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/cp-frr-files/0.log" Mar 21 
04:14:27 crc kubenswrapper[4685]: I0321 04:14:27.080329 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/frr-metrics/0.log" Mar 21 04:14:27 crc kubenswrapper[4685]: I0321 04:14:27.120390 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/kube-rbac-proxy/0.log" Mar 21 04:14:27 crc kubenswrapper[4685]: I0321 04:14:27.141793 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/kube-rbac-proxy-frr/0.log" Mar 21 04:14:27 crc kubenswrapper[4685]: I0321 04:14:27.261788 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/reloader/0.log" Mar 21 04:14:27 crc kubenswrapper[4685]: I0321 04:14:27.315925 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-rjxbv_545a6f92-59ae-4ffb-824d-e493044c0082/frr-k8s-webhook-server/0.log" Mar 21 04:14:27 crc kubenswrapper[4685]: I0321 04:14:27.484123 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tsjrs_7e226197-b1ce-49a2-a3b9-5aed3d774a12/frr/0.log" Mar 21 04:14:27 crc kubenswrapper[4685]: I0321 04:14:27.551613 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-9b7d5d78b-jx8nv_ed0eadeb-865c-4742-b429-5f8e0bd67f2b/manager/0.log" Mar 21 04:14:27 crc kubenswrapper[4685]: I0321 04:14:27.613327 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-654575f8df-qj9tz_f2976724-903c-43f4-b917-da8a483a2e9e/webhook-server/0.log" Mar 21 04:14:27 crc kubenswrapper[4685]: I0321 04:14:27.671672 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-cd4d4_dc9a0be8-5b8c-43d8-a670-06541535d7a0/kube-rbac-proxy/0.log" Mar 21 04:14:27 crc kubenswrapper[4685]: I0321 04:14:27.785875 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-cd4d4_dc9a0be8-5b8c-43d8-a670-06541535d7a0/speaker/0.log" Mar 21 04:14:33 crc kubenswrapper[4685]: I0321 04:14:33.301419 4685 scope.go:117] "RemoveContainer" containerID="d777c9ea0ed74e12c330732b76bb70566fa8eb4260f58ef17bc4882bafb2cbc6" Mar 21 04:14:33 crc kubenswrapper[4685]: E0321 04:14:33.302815 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7r9cg_openshift-machine-config-operator(cea46fe2-4e41-43ab-a069-cb30fb4e732c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" Mar 21 04:14:39 crc kubenswrapper[4685]: I0321 04:14:39.944011 4685 scope.go:117] "RemoveContainer" containerID="888486b10a5a1c291db786c174630f8cb84a6c68276f6917d0e8e43c51417d19" Mar 21 04:14:48 crc kubenswrapper[4685]: I0321 04:14:48.303934 4685 scope.go:117] "RemoveContainer" containerID="d777c9ea0ed74e12c330732b76bb70566fa8eb4260f58ef17bc4882bafb2cbc6" Mar 21 04:14:48 crc kubenswrapper[4685]: E0321 04:14:48.304631 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7r9cg_openshift-machine-config-operator(cea46fe2-4e41-43ab-a069-cb30fb4e732c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" Mar 21 04:14:51 crc kubenswrapper[4685]: I0321 04:14:51.408826 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs_8cbeffba-f99c-488d-b0db-1cf3b8e31823/util/0.log" Mar 21 04:14:51 crc kubenswrapper[4685]: I0321 04:14:51.596789 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs_8cbeffba-f99c-488d-b0db-1cf3b8e31823/util/0.log" Mar 21 04:14:51 crc kubenswrapper[4685]: I0321 04:14:51.606767 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs_8cbeffba-f99c-488d-b0db-1cf3b8e31823/pull/0.log" Mar 21 04:14:51 crc kubenswrapper[4685]: I0321 04:14:51.729556 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs_8cbeffba-f99c-488d-b0db-1cf3b8e31823/pull/0.log" Mar 21 04:14:51 crc kubenswrapper[4685]: I0321 04:14:51.898280 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs_8cbeffba-f99c-488d-b0db-1cf3b8e31823/util/0.log" Mar 21 04:14:51 crc kubenswrapper[4685]: I0321 04:14:51.932685 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs_8cbeffba-f99c-488d-b0db-1cf3b8e31823/extract/0.log" Mar 21 04:14:51 crc kubenswrapper[4685]: I0321 04:14:51.961566 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rf7bs_8cbeffba-f99c-488d-b0db-1cf3b8e31823/pull/0.log" Mar 21 04:14:52 crc kubenswrapper[4685]: I0321 04:14:52.042196 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-h85m6_43541153-b685-4759-bedf-261b2936431d/extract-utilities/0.log" Mar 21 04:14:52 crc kubenswrapper[4685]: I0321 04:14:52.259708 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-h85m6_43541153-b685-4759-bedf-261b2936431d/extract-content/0.log" Mar 21 04:14:52 crc kubenswrapper[4685]: I0321 04:14:52.261183 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-h85m6_43541153-b685-4759-bedf-261b2936431d/extract-content/0.log" Mar 21 04:14:52 crc kubenswrapper[4685]: I0321 04:14:52.273113 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-h85m6_43541153-b685-4759-bedf-261b2936431d/extract-utilities/0.log" Mar 21 04:14:52 crc kubenswrapper[4685]: I0321 04:14:52.419393 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-h85m6_43541153-b685-4759-bedf-261b2936431d/extract-utilities/0.log" Mar 21 04:14:52 crc kubenswrapper[4685]: I0321 04:14:52.459995 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-h85m6_43541153-b685-4759-bedf-261b2936431d/extract-content/0.log" Mar 21 04:14:52 crc kubenswrapper[4685]: I0321 04:14:52.622713 4685 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gn69w_8038baba-1420-4797-9752-5490c0940929/extract-utilities/0.log" Mar 21 04:14:52 crc kubenswrapper[4685]: I0321 04:14:52.733897 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gn69w_8038baba-1420-4797-9752-5490c0940929/extract-utilities/0.log" Mar 21 04:14:52 crc kubenswrapper[4685]: I0321 04:14:52.736072 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-h85m6_43541153-b685-4759-bedf-261b2936431d/registry-server/0.log" Mar 21 04:14:52 crc kubenswrapper[4685]: I0321 04:14:52.749214 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gn69w_8038baba-1420-4797-9752-5490c0940929/extract-content/0.log" Mar 21 04:14:52 crc kubenswrapper[4685]: I0321 04:14:52.798880 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gn69w_8038baba-1420-4797-9752-5490c0940929/extract-content/0.log" Mar 21 04:14:52 crc kubenswrapper[4685]: I0321 04:14:52.960441 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gn69w_8038baba-1420-4797-9752-5490c0940929/extract-content/0.log" Mar 21 04:14:53 crc kubenswrapper[4685]: I0321 04:14:53.006166 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gn69w_8038baba-1420-4797-9752-5490c0940929/extract-utilities/0.log" Mar 21 04:14:53 crc kubenswrapper[4685]: I0321 04:14:53.168673 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-h9g9c_1efd0452-eb45-4336-a0eb-2e171d3da229/marketplace-operator/0.log" Mar 21 04:14:53 crc kubenswrapper[4685]: I0321 04:14:53.218244 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gn69w_8038baba-1420-4797-9752-5490c0940929/registry-server/0.log" Mar 21 04:14:53 crc kubenswrapper[4685]: I0321 04:14:53.251424 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bcc52_c0edc692-b945-418e-8d2e-129f9c88644e/extract-utilities/0.log" Mar 21 04:14:53 crc kubenswrapper[4685]: I0321 04:14:53.372495 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bcc52_c0edc692-b945-418e-8d2e-129f9c88644e/extract-content/0.log" Mar 21 04:14:53 crc kubenswrapper[4685]: I0321 04:14:53.377418 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bcc52_c0edc692-b945-418e-8d2e-129f9c88644e/extract-utilities/0.log" Mar 21 04:14:53 crc kubenswrapper[4685]: I0321 04:14:53.463200 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bcc52_c0edc692-b945-418e-8d2e-129f9c88644e/extract-content/0.log" Mar 21 04:14:53 crc kubenswrapper[4685]: I0321 04:14:53.568614 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bcc52_c0edc692-b945-418e-8d2e-129f9c88644e/extract-content/0.log" Mar 21 04:14:53 crc kubenswrapper[4685]: I0321 04:14:53.574490 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bcc52_c0edc692-b945-418e-8d2e-129f9c88644e/extract-utilities/0.log" Mar 21 04:14:53 crc kubenswrapper[4685]: I0321 04:14:53.635962 4685 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bcc52_c0edc692-b945-418e-8d2e-129f9c88644e/registry-server/0.log" Mar 21 04:14:53 crc kubenswrapper[4685]: I0321 04:14:53.749950 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ttxzc_5118e92f-b64a-4a3b-b9e7-3902c745dbdd/extract-utilities/0.log" Mar 21 04:14:53 crc kubenswrapper[4685]: I0321 04:14:53.895990 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ttxzc_5118e92f-b64a-4a3b-b9e7-3902c745dbdd/extract-content/0.log" Mar 21 04:14:53 crc kubenswrapper[4685]: I0321 04:14:53.915353 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ttxzc_5118e92f-b64a-4a3b-b9e7-3902c745dbdd/extract-utilities/0.log" Mar 21 04:14:53 crc kubenswrapper[4685]: I0321 04:14:53.925161 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ttxzc_5118e92f-b64a-4a3b-b9e7-3902c745dbdd/extract-content/0.log" Mar 21 04:14:54 crc kubenswrapper[4685]: I0321 04:14:54.075460 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ttxzc_5118e92f-b64a-4a3b-b9e7-3902c745dbdd/extract-content/0.log" Mar 21 04:14:54 crc kubenswrapper[4685]: I0321 04:14:54.098380 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ttxzc_5118e92f-b64a-4a3b-b9e7-3902c745dbdd/extract-utilities/0.log" Mar 21 04:14:54 crc kubenswrapper[4685]: I0321 04:14:54.427741 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ttxzc_5118e92f-b64a-4a3b-b9e7-3902c745dbdd/registry-server/0.log" Mar 21 04:15:00 crc kubenswrapper[4685]: I0321 04:15:00.136696 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567775-79kdp"] Mar 21 04:15:00 crc kubenswrapper[4685]: E0321 04:15:00.137505 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ba66306-04db-4c93-980e-680dd8410a44" containerName="oc" Mar 21 04:15:00 crc kubenswrapper[4685]: I0321 04:15:00.137520 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ba66306-04db-4c93-980e-680dd8410a44" containerName="oc" Mar 21 04:15:00 crc kubenswrapper[4685]: I0321 04:15:00.137635 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ba66306-04db-4c93-980e-680dd8410a44" containerName="oc" Mar 21 04:15:00 crc kubenswrapper[4685]: I0321 04:15:00.138002 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-79kdp" Mar 21 04:15:00 crc kubenswrapper[4685]: I0321 04:15:00.139443 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20bd4ddc-7793-4fe4-8413-277946533858-secret-volume\") pod \"collect-profiles-29567775-79kdp\" (UID: \"20bd4ddc-7793-4fe4-8413-277946533858\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-79kdp" Mar 21 04:15:00 crc kubenswrapper[4685]: I0321 04:15:00.139500 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20bd4ddc-7793-4fe4-8413-277946533858-config-volume\") pod \"collect-profiles-29567775-79kdp\" (UID: \"20bd4ddc-7793-4fe4-8413-277946533858\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-79kdp" Mar 21 04:15:00 crc kubenswrapper[4685]: I0321 04:15:00.139536 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm9m9\" (UniqueName: \"kubernetes.io/projected/20bd4ddc-7793-4fe4-8413-277946533858-kube-api-access-bm9m9\") pod \"collect-profiles-29567775-79kdp\" (UID: \"20bd4ddc-7793-4fe4-8413-277946533858\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-79kdp" Mar 21 04:15:00 crc kubenswrapper[4685]: I0321 04:15:00.140470 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 21 04:15:00 crc kubenswrapper[4685]: I0321 04:15:00.142059 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 21 04:15:00 crc kubenswrapper[4685]: I0321 04:15:00.145915 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567775-79kdp"] Mar 21 04:15:00 crc kubenswrapper[4685]: I0321 04:15:00.240968 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20bd4ddc-7793-4fe4-8413-277946533858-secret-volume\") pod \"collect-profiles-29567775-79kdp\" (UID: \"20bd4ddc-7793-4fe4-8413-277946533858\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-79kdp" Mar 21 04:15:00 crc kubenswrapper[4685]: I0321 04:15:00.241025 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20bd4ddc-7793-4fe4-8413-277946533858-config-volume\") pod \"collect-profiles-29567775-79kdp\" (UID: \"20bd4ddc-7793-4fe4-8413-277946533858\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-79kdp" Mar 21 04:15:00 crc kubenswrapper[4685]: I0321 04:15:00.241048 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm9m9\" (UniqueName: \"kubernetes.io/projected/20bd4ddc-7793-4fe4-8413-277946533858-kube-api-access-bm9m9\") pod \"collect-profiles-29567775-79kdp\" (UID: \"20bd4ddc-7793-4fe4-8413-277946533858\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-79kdp" Mar 21 04:15:00 crc kubenswrapper[4685]: I0321 04:15:00.242114 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20bd4ddc-7793-4fe4-8413-277946533858-config-volume\") pod 
\"collect-profiles-29567775-79kdp\" (UID: \"20bd4ddc-7793-4fe4-8413-277946533858\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-79kdp" Mar 21 04:15:00 crc kubenswrapper[4685]: I0321 04:15:00.254234 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20bd4ddc-7793-4fe4-8413-277946533858-secret-volume\") pod \"collect-profiles-29567775-79kdp\" (UID: \"20bd4ddc-7793-4fe4-8413-277946533858\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-79kdp" Mar 21 04:15:00 crc kubenswrapper[4685]: I0321 04:15:00.256185 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm9m9\" (UniqueName: \"kubernetes.io/projected/20bd4ddc-7793-4fe4-8413-277946533858-kube-api-access-bm9m9\") pod \"collect-profiles-29567775-79kdp\" (UID: \"20bd4ddc-7793-4fe4-8413-277946533858\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-79kdp" Mar 21 04:15:00 crc kubenswrapper[4685]: I0321 04:15:00.458884 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-79kdp" Mar 21 04:15:00 crc kubenswrapper[4685]: I0321 04:15:00.877474 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567775-79kdp"] Mar 21 04:15:00 crc kubenswrapper[4685]: I0321 04:15:00.971003 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-79kdp" event={"ID":"20bd4ddc-7793-4fe4-8413-277946533858","Type":"ContainerStarted","Data":"83eb1186441fb055a53470cf3b90ff87b0e0d160f376eff0c91a7fea31bafd2f"} Mar 21 04:15:01 crc kubenswrapper[4685]: I0321 04:15:01.301127 4685 scope.go:117] "RemoveContainer" containerID="d777c9ea0ed74e12c330732b76bb70566fa8eb4260f58ef17bc4882bafb2cbc6" Mar 21 04:15:01 crc kubenswrapper[4685]: E0321 04:15:01.301724 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7r9cg_openshift-machine-config-operator(cea46fe2-4e41-43ab-a069-cb30fb4e732c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" Mar 21 04:15:01 crc kubenswrapper[4685]: I0321 04:15:01.977515 4685 generic.go:334] "Generic (PLEG): container finished" podID="20bd4ddc-7793-4fe4-8413-277946533858" containerID="b7db8f1760ff3eeb89ec0b7b640adb356f777af1d076006c539288920b9ae6d8" exitCode=0 Mar 21 04:15:01 crc kubenswrapper[4685]: I0321 04:15:01.977576 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-79kdp" event={"ID":"20bd4ddc-7793-4fe4-8413-277946533858","Type":"ContainerDied","Data":"b7db8f1760ff3eeb89ec0b7b640adb356f777af1d076006c539288920b9ae6d8"} Mar 21 04:15:03 crc kubenswrapper[4685]: I0321 04:15:03.213882 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-79kdp" Mar 21 04:15:03 crc kubenswrapper[4685]: I0321 04:15:03.278310 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bm9m9\" (UniqueName: \"kubernetes.io/projected/20bd4ddc-7793-4fe4-8413-277946533858-kube-api-access-bm9m9\") pod \"20bd4ddc-7793-4fe4-8413-277946533858\" (UID: \"20bd4ddc-7793-4fe4-8413-277946533858\") " Mar 21 04:15:03 crc kubenswrapper[4685]: I0321 04:15:03.278410 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20bd4ddc-7793-4fe4-8413-277946533858-secret-volume\") pod \"20bd4ddc-7793-4fe4-8413-277946533858\" (UID: \"20bd4ddc-7793-4fe4-8413-277946533858\") " Mar 21 04:15:03 crc kubenswrapper[4685]: I0321 04:15:03.278502 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20bd4ddc-7793-4fe4-8413-277946533858-config-volume\") pod \"20bd4ddc-7793-4fe4-8413-277946533858\" (UID: \"20bd4ddc-7793-4fe4-8413-277946533858\") " Mar 21 04:15:03 crc kubenswrapper[4685]: I0321 04:15:03.279246 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20bd4ddc-7793-4fe4-8413-277946533858-config-volume" (OuterVolumeSpecName: "config-volume") pod "20bd4ddc-7793-4fe4-8413-277946533858" (UID: "20bd4ddc-7793-4fe4-8413-277946533858"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:15:03 crc kubenswrapper[4685]: I0321 04:15:03.284406 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20bd4ddc-7793-4fe4-8413-277946533858-kube-api-access-bm9m9" (OuterVolumeSpecName: "kube-api-access-bm9m9") pod "20bd4ddc-7793-4fe4-8413-277946533858" (UID: "20bd4ddc-7793-4fe4-8413-277946533858"). InnerVolumeSpecName "kube-api-access-bm9m9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:15:03 crc kubenswrapper[4685]: I0321 04:15:03.285057 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20bd4ddc-7793-4fe4-8413-277946533858-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "20bd4ddc-7793-4fe4-8413-277946533858" (UID: "20bd4ddc-7793-4fe4-8413-277946533858"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:15:03 crc kubenswrapper[4685]: I0321 04:15:03.380593 4685 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20bd4ddc-7793-4fe4-8413-277946533858-config-volume\") on node \"crc\" DevicePath \"\"" Mar 21 04:15:03 crc kubenswrapper[4685]: I0321 04:15:03.380762 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bm9m9\" (UniqueName: \"kubernetes.io/projected/20bd4ddc-7793-4fe4-8413-277946533858-kube-api-access-bm9m9\") on node \"crc\" DevicePath \"\"" Mar 21 04:15:03 crc kubenswrapper[4685]: I0321 04:15:03.380778 4685 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20bd4ddc-7793-4fe4-8413-277946533858-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 21 04:15:03 crc kubenswrapper[4685]: I0321 04:15:03.989880 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-79kdp" event={"ID":"20bd4ddc-7793-4fe4-8413-277946533858","Type":"ContainerDied","Data":"83eb1186441fb055a53470cf3b90ff87b0e0d160f376eff0c91a7fea31bafd2f"} Mar 21 04:15:03 crc kubenswrapper[4685]: I0321 04:15:03.989922 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83eb1186441fb055a53470cf3b90ff87b0e0d160f376eff0c91a7fea31bafd2f" Mar 21 04:15:03 crc kubenswrapper[4685]: I0321 04:15:03.989979 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-79kdp" Mar 21 04:15:11 crc kubenswrapper[4685]: I0321 04:15:11.660128 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-t46dv"] Mar 21 04:15:11 crc kubenswrapper[4685]: E0321 04:15:11.661236 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20bd4ddc-7793-4fe4-8413-277946533858" containerName="collect-profiles" Mar 21 04:15:11 crc kubenswrapper[4685]: I0321 04:15:11.661261 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="20bd4ddc-7793-4fe4-8413-277946533858" containerName="collect-profiles" Mar 21 04:15:11 crc kubenswrapper[4685]: I0321 04:15:11.664135 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="20bd4ddc-7793-4fe4-8413-277946533858" containerName="collect-profiles" Mar 21 04:15:11 crc kubenswrapper[4685]: I0321 04:15:11.665557 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t46dv" Mar 21 04:15:11 crc kubenswrapper[4685]: I0321 04:15:11.674048 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t46dv"] Mar 21 04:15:11 crc kubenswrapper[4685]: I0321 04:15:11.786679 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3f52efa-3056-4ba2-968c-e76e0d97d5c9-catalog-content\") pod \"community-operators-t46dv\" (UID: \"d3f52efa-3056-4ba2-968c-e76e0d97d5c9\") " pod="openshift-marketplace/community-operators-t46dv" Mar 21 04:15:11 crc kubenswrapper[4685]: I0321 04:15:11.787038 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjnqd\" (UniqueName: \"kubernetes.io/projected/d3f52efa-3056-4ba2-968c-e76e0d97d5c9-kube-api-access-mjnqd\") pod \"community-operators-t46dv\" (UID: \"d3f52efa-3056-4ba2-968c-e76e0d97d5c9\") " pod="openshift-marketplace/community-operators-t46dv" Mar 21 04:15:11 crc kubenswrapper[4685]: I0321 04:15:11.787117 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3f52efa-3056-4ba2-968c-e76e0d97d5c9-utilities\") pod \"community-operators-t46dv\" (UID: \"d3f52efa-3056-4ba2-968c-e76e0d97d5c9\") " pod="openshift-marketplace/community-operators-t46dv" Mar 21 04:15:11 crc kubenswrapper[4685]: I0321 04:15:11.887918 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3f52efa-3056-4ba2-968c-e76e0d97d5c9-utilities\") pod \"community-operators-t46dv\" (UID: \"d3f52efa-3056-4ba2-968c-e76e0d97d5c9\") " pod="openshift-marketplace/community-operators-t46dv" Mar 21 04:15:11 crc kubenswrapper[4685]: I0321 04:15:11.887987 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3f52efa-3056-4ba2-968c-e76e0d97d5c9-catalog-content\") pod \"community-operators-t46dv\" (UID: \"d3f52efa-3056-4ba2-968c-e76e0d97d5c9\") " pod="openshift-marketplace/community-operators-t46dv" Mar 21 04:15:11 crc kubenswrapper[4685]: I0321 04:15:11.888047 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjnqd\" (UniqueName: \"kubernetes.io/projected/d3f52efa-3056-4ba2-968c-e76e0d97d5c9-kube-api-access-mjnqd\") pod \"community-operators-t46dv\" (UID: \"d3f52efa-3056-4ba2-968c-e76e0d97d5c9\") " pod="openshift-marketplace/community-operators-t46dv" Mar 21 04:15:11 crc kubenswrapper[4685]: I0321 04:15:11.888395 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3f52efa-3056-4ba2-968c-e76e0d97d5c9-utilities\") pod \"community-operators-t46dv\" (UID: \"d3f52efa-3056-4ba2-968c-e76e0d97d5c9\") " pod="openshift-marketplace/community-operators-t46dv" Mar 21 04:15:11 crc kubenswrapper[4685]: I0321 04:15:11.888435 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3f52efa-3056-4ba2-968c-e76e0d97d5c9-catalog-content\") pod \"community-operators-t46dv\" (UID: \"d3f52efa-3056-4ba2-968c-e76e0d97d5c9\") " pod="openshift-marketplace/community-operators-t46dv" Mar 21 04:15:11 crc kubenswrapper[4685]: I0321 04:15:11.908254 4685 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mjnqd\" (UniqueName: \"kubernetes.io/projected/d3f52efa-3056-4ba2-968c-e76e0d97d5c9-kube-api-access-mjnqd\") pod \"community-operators-t46dv\" (UID: \"d3f52efa-3056-4ba2-968c-e76e0d97d5c9\") " pod="openshift-marketplace/community-operators-t46dv" Mar 21 04:15:11 crc kubenswrapper[4685]: I0321 04:15:11.983350 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t46dv" Mar 21 04:15:12 crc kubenswrapper[4685]: I0321 04:15:12.245769 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t46dv"] Mar 21 04:15:12 crc kubenswrapper[4685]: I0321 04:15:12.301534 4685 scope.go:117] "RemoveContainer" containerID="d777c9ea0ed74e12c330732b76bb70566fa8eb4260f58ef17bc4882bafb2cbc6" Mar 21 04:15:12 crc kubenswrapper[4685]: E0321 04:15:12.301766 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7r9cg_openshift-machine-config-operator(cea46fe2-4e41-43ab-a069-cb30fb4e732c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" Mar 21 04:15:13 crc kubenswrapper[4685]: I0321 04:15:13.065505 4685 generic.go:334] "Generic (PLEG): container finished" podID="d3f52efa-3056-4ba2-968c-e76e0d97d5c9" containerID="506ef66a2707ae9644bb640d4c431499ff723c6249ccf8327e7a2998736ca3bc" exitCode=0 Mar 21 04:15:13 crc kubenswrapper[4685]: I0321 04:15:13.065556 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t46dv" event={"ID":"d3f52efa-3056-4ba2-968c-e76e0d97d5c9","Type":"ContainerDied","Data":"506ef66a2707ae9644bb640d4c431499ff723c6249ccf8327e7a2998736ca3bc"} Mar 21 04:15:13 crc kubenswrapper[4685]: I0321 04:15:13.065584 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t46dv" event={"ID":"d3f52efa-3056-4ba2-968c-e76e0d97d5c9","Type":"ContainerStarted","Data":"d9e2e5e8f80fe4b46c9b20225f5351e1eab41d17fd585d4623922de5d3b60ccc"} Mar 21 04:15:15 crc kubenswrapper[4685]: I0321 04:15:15.079911 4685 generic.go:334] "Generic (PLEG): container finished" podID="d3f52efa-3056-4ba2-968c-e76e0d97d5c9" containerID="3807ee31570ab9739ce7a604bc70b0ba8a58e05725833334d69210bd6ebddfcc" exitCode=0 Mar 21 04:15:15 crc kubenswrapper[4685]: I0321 04:15:15.079976 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t46dv" event={"ID":"d3f52efa-3056-4ba2-968c-e76e0d97d5c9","Type":"ContainerDied","Data":"3807ee31570ab9739ce7a604bc70b0ba8a58e05725833334d69210bd6ebddfcc"} Mar 21 04:15:17 crc kubenswrapper[4685]: I0321 04:15:17.099323 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t46dv" event={"ID":"d3f52efa-3056-4ba2-968c-e76e0d97d5c9","Type":"ContainerStarted","Data":"7cd0cd488b43e34067dc1a0de6cb0718f9d215fe0fc1dba0092605ba8444bfb3"} Mar 21 04:15:17 crc kubenswrapper[4685]: I0321 04:15:17.118998 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-t46dv" podStartSLOduration=3.3672931520000002 podStartE2EDuration="6.118974855s" podCreationTimestamp="2026-03-21 04:15:11 +0000 UTC" firstStartedPulling="2026-03-21 04:15:13.067458785 +0000 UTC m=+1745.544527577" 
lastFinishedPulling="2026-03-21 04:15:15.819140468 +0000 UTC m=+1748.296209280" observedRunningTime="2026-03-21 04:15:17.113801867 +0000 UTC m=+1749.590870669" watchObservedRunningTime="2026-03-21 04:15:17.118974855 +0000 UTC m=+1749.596043647" Mar 21 04:15:21 crc kubenswrapper[4685]: I0321 04:15:21.983775 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-t46dv" Mar 21 04:15:21 crc kubenswrapper[4685]: I0321 04:15:21.984284 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-t46dv" Mar 21 04:15:22 crc kubenswrapper[4685]: I0321 04:15:22.024963 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-t46dv" Mar 21 04:15:22 crc kubenswrapper[4685]: I0321 04:15:22.183417 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-t46dv" Mar 21 04:15:22 crc kubenswrapper[4685]: I0321 04:15:22.263666 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t46dv"] Mar 21 04:15:23 crc kubenswrapper[4685]: I0321 04:15:23.301173 4685 scope.go:117] "RemoveContainer" containerID="d777c9ea0ed74e12c330732b76bb70566fa8eb4260f58ef17bc4882bafb2cbc6" Mar 21 04:15:23 crc kubenswrapper[4685]: E0321 04:15:23.301345 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7r9cg_openshift-machine-config-operator(cea46fe2-4e41-43ab-a069-cb30fb4e732c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" Mar 21 04:15:24 crc kubenswrapper[4685]: I0321 04:15:24.145553 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-t46dv" podUID="d3f52efa-3056-4ba2-968c-e76e0d97d5c9" containerName="registry-server" containerID="cri-o://7cd0cd488b43e34067dc1a0de6cb0718f9d215fe0fc1dba0092605ba8444bfb3" gracePeriod=2 Mar 21 04:15:25 crc kubenswrapper[4685]: I0321 04:15:25.070135 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t46dv" Mar 21 04:15:25 crc kubenswrapper[4685]: I0321 04:15:25.155174 4685 generic.go:334] "Generic (PLEG): container finished" podID="d3f52efa-3056-4ba2-968c-e76e0d97d5c9" containerID="7cd0cd488b43e34067dc1a0de6cb0718f9d215fe0fc1dba0092605ba8444bfb3" exitCode=0 Mar 21 04:15:25 crc kubenswrapper[4685]: I0321 04:15:25.155616 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t46dv" event={"ID":"d3f52efa-3056-4ba2-968c-e76e0d97d5c9","Type":"ContainerDied","Data":"7cd0cd488b43e34067dc1a0de6cb0718f9d215fe0fc1dba0092605ba8444bfb3"} Mar 21 04:15:25 crc kubenswrapper[4685]: I0321 04:15:25.155783 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t46dv" event={"ID":"d3f52efa-3056-4ba2-968c-e76e0d97d5c9","Type":"ContainerDied","Data":"d9e2e5e8f80fe4b46c9b20225f5351e1eab41d17fd585d4623922de5d3b60ccc"} Mar 21 04:15:25 crc kubenswrapper[4685]: I0321 04:15:25.156146 4685 scope.go:117] "RemoveContainer" containerID="7cd0cd488b43e34067dc1a0de6cb0718f9d215fe0fc1dba0092605ba8444bfb3" Mar 21 04:15:25 crc kubenswrapper[4685]: I0321 04:15:25.156883 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t46dv" Mar 21 04:15:25 crc kubenswrapper[4685]: I0321 04:15:25.173172 4685 scope.go:117] "RemoveContainer" containerID="3807ee31570ab9739ce7a604bc70b0ba8a58e05725833334d69210bd6ebddfcc" Mar 21 04:15:25 crc kubenswrapper[4685]: I0321 04:15:25.187336 4685 scope.go:117] "RemoveContainer" containerID="506ef66a2707ae9644bb640d4c431499ff723c6249ccf8327e7a2998736ca3bc" Mar 21 04:15:25 crc kubenswrapper[4685]: I0321 04:15:25.203429 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjnqd\" (UniqueName: \"kubernetes.io/projected/d3f52efa-3056-4ba2-968c-e76e0d97d5c9-kube-api-access-mjnqd\") pod \"d3f52efa-3056-4ba2-968c-e76e0d97d5c9\" (UID: \"d3f52efa-3056-4ba2-968c-e76e0d97d5c9\") " Mar 21 04:15:25 crc kubenswrapper[4685]: I0321 04:15:25.203529 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3f52efa-3056-4ba2-968c-e76e0d97d5c9-utilities\") pod \"d3f52efa-3056-4ba2-968c-e76e0d97d5c9\" (UID: \"d3f52efa-3056-4ba2-968c-e76e0d97d5c9\") " Mar 21 04:15:25 crc kubenswrapper[4685]: I0321 04:15:25.203569 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3f52efa-3056-4ba2-968c-e76e0d97d5c9-catalog-content\") pod \"d3f52efa-3056-4ba2-968c-e76e0d97d5c9\" (UID: \"d3f52efa-3056-4ba2-968c-e76e0d97d5c9\") " Mar 21 04:15:25 crc kubenswrapper[4685]: I0321 04:15:25.204187 4685 scope.go:117] "RemoveContainer" containerID="7cd0cd488b43e34067dc1a0de6cb0718f9d215fe0fc1dba0092605ba8444bfb3" Mar 21 04:15:25 crc kubenswrapper[4685]: E0321 04:15:25.204631 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cd0cd488b43e34067dc1a0de6cb0718f9d215fe0fc1dba0092605ba8444bfb3\": container with ID starting with 7cd0cd488b43e34067dc1a0de6cb0718f9d215fe0fc1dba0092605ba8444bfb3 not found: ID does not exist" containerID="7cd0cd488b43e34067dc1a0de6cb0718f9d215fe0fc1dba0092605ba8444bfb3" Mar 21 04:15:25 crc kubenswrapper[4685]: I0321 04:15:25.204701 4685 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7cd0cd488b43e34067dc1a0de6cb0718f9d215fe0fc1dba0092605ba8444bfb3"} err="failed to get container status \"7cd0cd488b43e34067dc1a0de6cb0718f9d215fe0fc1dba0092605ba8444bfb3\": rpc error: code = NotFound desc = could not find container \"7cd0cd488b43e34067dc1a0de6cb0718f9d215fe0fc1dba0092605ba8444bfb3\": container with ID starting with 7cd0cd488b43e34067dc1a0de6cb0718f9d215fe0fc1dba0092605ba8444bfb3 not found: ID does not exist" Mar 21 04:15:25 crc kubenswrapper[4685]: I0321 04:15:25.204744 4685 scope.go:117] "RemoveContainer" containerID="3807ee31570ab9739ce7a604bc70b0ba8a58e05725833334d69210bd6ebddfcc" Mar 21 04:15:25 crc kubenswrapper[4685]: E0321 04:15:25.205072 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3807ee31570ab9739ce7a604bc70b0ba8a58e05725833334d69210bd6ebddfcc\": container with ID starting with 3807ee31570ab9739ce7a604bc70b0ba8a58e05725833334d69210bd6ebddfcc not found: ID does not exist" containerID="3807ee31570ab9739ce7a604bc70b0ba8a58e05725833334d69210bd6ebddfcc" Mar 21 04:15:25 crc kubenswrapper[4685]: I0321 04:15:25.205242 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3807ee31570ab9739ce7a604bc70b0ba8a58e05725833334d69210bd6ebddfcc"} err="failed to get container status \"3807ee31570ab9739ce7a604bc70b0ba8a58e05725833334d69210bd6ebddfcc\": rpc error: code = NotFound desc = could not find container \"3807ee31570ab9739ce7a604bc70b0ba8a58e05725833334d69210bd6ebddfcc\": container with ID starting with 3807ee31570ab9739ce7a604bc70b0ba8a58e05725833334d69210bd6ebddfcc not found: ID does not exist" Mar 21 04:15:25 crc kubenswrapper[4685]: I0321 04:15:25.205343 4685 scope.go:117] "RemoveContainer" containerID="506ef66a2707ae9644bb640d4c431499ff723c6249ccf8327e7a2998736ca3bc" Mar 21 04:15:25 crc kubenswrapper[4685]: E0321 04:15:25.205612 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"506ef66a2707ae9644bb640d4c431499ff723c6249ccf8327e7a2998736ca3bc\": container with ID starting with 506ef66a2707ae9644bb640d4c431499ff723c6249ccf8327e7a2998736ca3bc not found: ID does not exist" containerID="506ef66a2707ae9644bb640d4c431499ff723c6249ccf8327e7a2998736ca3bc" Mar 21 04:15:25 crc kubenswrapper[4685]: I0321 04:15:25.205655 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"506ef66a2707ae9644bb640d4c431499ff723c6249ccf8327e7a2998736ca3bc"} err="failed to get container status \"506ef66a2707ae9644bb640d4c431499ff723c6249ccf8327e7a2998736ca3bc\": rpc error: code = NotFound desc = could not find container \"506ef66a2707ae9644bb640d4c431499ff723c6249ccf8327e7a2998736ca3bc\": container with ID starting with 506ef66a2707ae9644bb640d4c431499ff723c6249ccf8327e7a2998736ca3bc not found: ID does not exist" Mar 21 04:15:25 crc kubenswrapper[4685]: I0321 04:15:25.206132 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3f52efa-3056-4ba2-968c-e76e0d97d5c9-utilities" (OuterVolumeSpecName: "utilities") pod "d3f52efa-3056-4ba2-968c-e76e0d97d5c9" (UID: "d3f52efa-3056-4ba2-968c-e76e0d97d5c9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:15:25 crc kubenswrapper[4685]: I0321 04:15:25.212802 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3f52efa-3056-4ba2-968c-e76e0d97d5c9-kube-api-access-mjnqd" (OuterVolumeSpecName: "kube-api-access-mjnqd") pod "d3f52efa-3056-4ba2-968c-e76e0d97d5c9" (UID: "d3f52efa-3056-4ba2-968c-e76e0d97d5c9"). InnerVolumeSpecName "kube-api-access-mjnqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:15:25 crc kubenswrapper[4685]: I0321 04:15:25.256629 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3f52efa-3056-4ba2-968c-e76e0d97d5c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d3f52efa-3056-4ba2-968c-e76e0d97d5c9" (UID: "d3f52efa-3056-4ba2-968c-e76e0d97d5c9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:15:25 crc kubenswrapper[4685]: I0321 04:15:25.305696 4685 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3f52efa-3056-4ba2-968c-e76e0d97d5c9-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:15:25 crc kubenswrapper[4685]: I0321 04:15:25.305769 4685 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3f52efa-3056-4ba2-968c-e76e0d97d5c9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:15:25 crc kubenswrapper[4685]: I0321 04:15:25.305791 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjnqd\" (UniqueName: \"kubernetes.io/projected/d3f52efa-3056-4ba2-968c-e76e0d97d5c9-kube-api-access-mjnqd\") on node \"crc\" DevicePath \"\"" Mar 21 04:15:25 crc kubenswrapper[4685]: I0321 04:15:25.494587 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t46dv"] Mar 21 04:15:25 crc kubenswrapper[4685]: I0321 04:15:25.505784 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-t46dv"] Mar 21 04:15:26 crc kubenswrapper[4685]: I0321 04:15:26.314462 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3f52efa-3056-4ba2-968c-e76e0d97d5c9" path="/var/lib/kubelet/pods/d3f52efa-3056-4ba2-968c-e76e0d97d5c9/volumes" Mar 21 04:15:34 crc kubenswrapper[4685]: I0321 04:15:34.305238 4685 scope.go:117] "RemoveContainer" containerID="d777c9ea0ed74e12c330732b76bb70566fa8eb4260f58ef17bc4882bafb2cbc6" Mar 21 04:15:34 crc kubenswrapper[4685]: E0321 04:15:34.306106 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7r9cg_openshift-machine-config-operator(cea46fe2-4e41-43ab-a069-cb30fb4e732c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" Mar 21 04:15:46 crc kubenswrapper[4685]: I0321 04:15:46.300865 4685 scope.go:117] "RemoveContainer" containerID="d777c9ea0ed74e12c330732b76bb70566fa8eb4260f58ef17bc4882bafb2cbc6" Mar 21 04:15:46 crc kubenswrapper[4685]: E0321 04:15:46.301601 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7r9cg_openshift-machine-config-operator(cea46fe2-4e41-43ab-a069-cb30fb4e732c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" Mar 21 04:16:00 crc kubenswrapper[4685]: I0321 04:16:00.132686 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567776-gk99r"] Mar 21 04:16:00 crc kubenswrapper[4685]: E0321 04:16:00.133545 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3f52efa-3056-4ba2-968c-e76e0d97d5c9" containerName="extract-content" Mar 21 04:16:00 crc kubenswrapper[4685]: I0321 04:16:00.133562 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3f52efa-3056-4ba2-968c-e76e0d97d5c9" containerName="extract-content" Mar 21 04:16:00 crc kubenswrapper[4685]: E0321 04:16:00.133577 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3f52efa-3056-4ba2-968c-e76e0d97d5c9" containerName="extract-utilities" Mar 21 04:16:00 crc kubenswrapper[4685]: I0321 04:16:00.133588 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3f52efa-3056-4ba2-968c-e76e0d97d5c9" containerName="extract-utilities" Mar 21 04:16:00 crc kubenswrapper[4685]: E0321 04:16:00.133603 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3f52efa-3056-4ba2-968c-e76e0d97d5c9" containerName="registry-server" Mar 21 04:16:00 crc kubenswrapper[4685]: I0321 04:16:00.133612 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3f52efa-3056-4ba2-968c-e76e0d97d5c9" containerName="registry-server" Mar 21 04:16:00 crc kubenswrapper[4685]: I0321 04:16:00.133746 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3f52efa-3056-4ba2-968c-e76e0d97d5c9" containerName="registry-server" Mar 21 04:16:00 crc kubenswrapper[4685]: I0321 04:16:00.134202 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567776-gk99r" Mar 21 04:16:00 crc kubenswrapper[4685]: I0321 04:16:00.136269 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 04:16:00 crc kubenswrapper[4685]: I0321 04:16:00.138598 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567776-gk99r"] Mar 21 04:16:00 crc kubenswrapper[4685]: I0321 04:16:00.140089 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k75cc" Mar 21 04:16:00 crc kubenswrapper[4685]: I0321 04:16:00.140289 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 04:16:00 crc kubenswrapper[4685]: I0321 04:16:00.233993 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk8rz\" (UniqueName: \"kubernetes.io/projected/574a7e26-428c-4869-8643-3d08670b70e8-kube-api-access-zk8rz\") pod \"auto-csr-approver-29567776-gk99r\" (UID: \"574a7e26-428c-4869-8643-3d08670b70e8\") " pod="openshift-infra/auto-csr-approver-29567776-gk99r" Mar 21 04:16:00 crc kubenswrapper[4685]: I0321 04:16:00.301190 4685 scope.go:117] "RemoveContainer" containerID="d777c9ea0ed74e12c330732b76bb70566fa8eb4260f58ef17bc4882bafb2cbc6" Mar 21 04:16:00 crc kubenswrapper[4685]: E0321 04:16:00.301622 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7r9cg_openshift-machine-config-operator(cea46fe2-4e41-43ab-a069-cb30fb4e732c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" Mar 21 04:16:00 crc kubenswrapper[4685]: I0321 04:16:00.335702 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk8rz\" (UniqueName: \"kubernetes.io/projected/574a7e26-428c-4869-8643-3d08670b70e8-kube-api-access-zk8rz\") pod \"auto-csr-approver-29567776-gk99r\" (UID: \"574a7e26-428c-4869-8643-3d08670b70e8\") " pod="openshift-infra/auto-csr-approver-29567776-gk99r" Mar 21 04:16:00 crc kubenswrapper[4685]: I0321 04:16:00.364339 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk8rz\" (UniqueName: \"kubernetes.io/projected/574a7e26-428c-4869-8643-3d08670b70e8-kube-api-access-zk8rz\") pod \"auto-csr-approver-29567776-gk99r\" (UID: \"574a7e26-428c-4869-8643-3d08670b70e8\") " pod="openshift-infra/auto-csr-approver-29567776-gk99r" Mar 21 04:16:00 crc kubenswrapper[4685]: I0321 04:16:00.454513 4685 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567776-gk99r" Mar 21 04:16:00 crc kubenswrapper[4685]: I0321 04:16:00.693549 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567776-gk99r"] Mar 21 04:16:01 crc kubenswrapper[4685]: I0321 04:16:01.376291 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567776-gk99r" event={"ID":"574a7e26-428c-4869-8643-3d08670b70e8","Type":"ContainerStarted","Data":"53cde9aab61c8727037f45ef26567c6d011f54286a03a0f6df78daf92a67f17d"} Mar 21 04:16:02 crc kubenswrapper[4685]: I0321 04:16:02.405812 4685 generic.go:334] "Generic (PLEG): container finished" podID="574a7e26-428c-4869-8643-3d08670b70e8" containerID="7e53103d882acb2e83a5bacbdad82347da2ee02fbc59b6720b980a621fcf5b92" exitCode=0 Mar 21 04:16:02 crc kubenswrapper[4685]: I0321 04:16:02.405877 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567776-gk99r" event={"ID":"574a7e26-428c-4869-8643-3d08670b70e8","Type":"ContainerDied","Data":"7e53103d882acb2e83a5bacbdad82347da2ee02fbc59b6720b980a621fcf5b92"} Mar 21 04:16:03 crc kubenswrapper[4685]: I0321 04:16:03.648428 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567776-gk99r" Mar 21 04:16:03 crc kubenswrapper[4685]: I0321 04:16:03.694893 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk8rz\" (UniqueName: \"kubernetes.io/projected/574a7e26-428c-4869-8643-3d08670b70e8-kube-api-access-zk8rz\") pod \"574a7e26-428c-4869-8643-3d08670b70e8\" (UID: \"574a7e26-428c-4869-8643-3d08670b70e8\") " Mar 21 04:16:03 crc kubenswrapper[4685]: I0321 04:16:03.705420 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/574a7e26-428c-4869-8643-3d08670b70e8-kube-api-access-zk8rz" (OuterVolumeSpecName: "kube-api-access-zk8rz") pod "574a7e26-428c-4869-8643-3d08670b70e8" (UID: "574a7e26-428c-4869-8643-3d08670b70e8"). InnerVolumeSpecName "kube-api-access-zk8rz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:16:03 crc kubenswrapper[4685]: I0321 04:16:03.796735 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk8rz\" (UniqueName: \"kubernetes.io/projected/574a7e26-428c-4869-8643-3d08670b70e8-kube-api-access-zk8rz\") on node \"crc\" DevicePath \"\"" Mar 21 04:16:04 crc kubenswrapper[4685]: I0321 04:16:04.419357 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567776-gk99r" event={"ID":"574a7e26-428c-4869-8643-3d08670b70e8","Type":"ContainerDied","Data":"53cde9aab61c8727037f45ef26567c6d011f54286a03a0f6df78daf92a67f17d"} Mar 21 04:16:04 crc kubenswrapper[4685]: I0321 04:16:04.419592 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53cde9aab61c8727037f45ef26567c6d011f54286a03a0f6df78daf92a67f17d" Mar 21 04:16:04 crc kubenswrapper[4685]: I0321 04:16:04.419439 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567776-gk99r" Mar 21 04:16:04 crc kubenswrapper[4685]: I0321 04:16:04.702446 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567770-9677p"] Mar 21 04:16:04 crc kubenswrapper[4685]: I0321 04:16:04.706686 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567770-9677p"] Mar 21 04:16:06 crc kubenswrapper[4685]: I0321 04:16:06.307096 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75432731-b3b0-48bb-a0b9-6985398ddaf5" path="/var/lib/kubelet/pods/75432731-b3b0-48bb-a0b9-6985398ddaf5/volumes" Mar 21 04:16:06 crc kubenswrapper[4685]: I0321 04:16:06.445331 4685 generic.go:334] "Generic (PLEG): container finished" podID="26eea1ac-1be6-405e-a606-dadf213577a2" containerID="c55bc7fca4f251b216c79e669b7e9caad787358e25c805dd7819ae55069f924c" exitCode=0 Mar 21 04:16:06 crc kubenswrapper[4685]: I0321 04:16:06.445378 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pvlwj/must-gather-rv9gg" event={"ID":"26eea1ac-1be6-405e-a606-dadf213577a2","Type":"ContainerDied","Data":"c55bc7fca4f251b216c79e669b7e9caad787358e25c805dd7819ae55069f924c"} Mar 21 04:16:06 crc kubenswrapper[4685]: I0321 04:16:06.445941 4685 scope.go:117] "RemoveContainer" containerID="c55bc7fca4f251b216c79e669b7e9caad787358e25c805dd7819ae55069f924c" Mar 21 04:16:06 crc kubenswrapper[4685]: I0321 04:16:06.580089 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pvlwj_must-gather-rv9gg_26eea1ac-1be6-405e-a606-dadf213577a2/gather/0.log" Mar 21 04:16:15 crc kubenswrapper[4685]: I0321 04:16:15.301432 4685 scope.go:117] "RemoveContainer" containerID="d777c9ea0ed74e12c330732b76bb70566fa8eb4260f58ef17bc4882bafb2cbc6" Mar 21 04:16:15 crc kubenswrapper[4685]: E0321 04:16:15.302303 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7r9cg_openshift-machine-config-operator(cea46fe2-4e41-43ab-a069-cb30fb4e732c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" Mar 21 04:16:16 crc kubenswrapper[4685]: I0321 04:16:16.298304 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pvlwj/must-gather-rv9gg"] Mar 21 04:16:16 crc kubenswrapper[4685]: I0321 04:16:16.298565 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-pvlwj/must-gather-rv9gg" podUID="26eea1ac-1be6-405e-a606-dadf213577a2" containerName="copy" containerID="cri-o://5716e822b3b1d2785bdd39e067f9486705adfdd1b2955898e487b8f5c1bcb811" gracePeriod=2 Mar 21 04:16:16 crc kubenswrapper[4685]: I0321 04:16:16.350255 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pvlwj/must-gather-rv9gg"] Mar 21 04:16:16 crc kubenswrapper[4685]: I0321 04:16:16.515055 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pvlwj_must-gather-rv9gg_26eea1ac-1be6-405e-a606-dadf213577a2/copy/0.log" Mar 21 04:16:16 crc kubenswrapper[4685]: I0321 04:16:16.516802 4685 generic.go:334] "Generic (PLEG): container finished" podID="26eea1ac-1be6-405e-a606-dadf213577a2" containerID="5716e822b3b1d2785bdd39e067f9486705adfdd1b2955898e487b8f5c1bcb811" exitCode=143 Mar 21 04:16:16 crc kubenswrapper[4685]: I0321 
04:16:16.629106 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pvlwj_must-gather-rv9gg_26eea1ac-1be6-405e-a606-dadf213577a2/copy/0.log" Mar 21 04:16:16 crc kubenswrapper[4685]: I0321 04:16:16.629427 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pvlwj/must-gather-rv9gg" Mar 21 04:16:16 crc kubenswrapper[4685]: I0321 04:16:16.680976 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hv8z\" (UniqueName: \"kubernetes.io/projected/26eea1ac-1be6-405e-a606-dadf213577a2-kube-api-access-5hv8z\") pod \"26eea1ac-1be6-405e-a606-dadf213577a2\" (UID: \"26eea1ac-1be6-405e-a606-dadf213577a2\") " Mar 21 04:16:16 crc kubenswrapper[4685]: I0321 04:16:16.681079 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/26eea1ac-1be6-405e-a606-dadf213577a2-must-gather-output\") pod \"26eea1ac-1be6-405e-a606-dadf213577a2\" (UID: \"26eea1ac-1be6-405e-a606-dadf213577a2\") " Mar 21 04:16:16 crc kubenswrapper[4685]: I0321 04:16:16.686730 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26eea1ac-1be6-405e-a606-dadf213577a2-kube-api-access-5hv8z" (OuterVolumeSpecName: "kube-api-access-5hv8z") pod "26eea1ac-1be6-405e-a606-dadf213577a2" (UID: "26eea1ac-1be6-405e-a606-dadf213577a2"). InnerVolumeSpecName "kube-api-access-5hv8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:16:16 crc kubenswrapper[4685]: I0321 04:16:16.761905 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26eea1ac-1be6-405e-a606-dadf213577a2-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "26eea1ac-1be6-405e-a606-dadf213577a2" (UID: "26eea1ac-1be6-405e-a606-dadf213577a2"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:16:16 crc kubenswrapper[4685]: I0321 04:16:16.782276 4685 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/26eea1ac-1be6-405e-a606-dadf213577a2-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 21 04:16:16 crc kubenswrapper[4685]: I0321 04:16:16.782323 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hv8z\" (UniqueName: \"kubernetes.io/projected/26eea1ac-1be6-405e-a606-dadf213577a2-kube-api-access-5hv8z\") on node \"crc\" DevicePath \"\"" Mar 21 04:16:17 crc kubenswrapper[4685]: I0321 04:16:17.523213 4685 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pvlwj_must-gather-rv9gg_26eea1ac-1be6-405e-a606-dadf213577a2/copy/0.log" Mar 21 04:16:17 crc kubenswrapper[4685]: I0321 04:16:17.523561 4685 scope.go:117] "RemoveContainer" containerID="5716e822b3b1d2785bdd39e067f9486705adfdd1b2955898e487b8f5c1bcb811" Mar 21 04:16:17 crc kubenswrapper[4685]: I0321 04:16:17.523656 4685 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pvlwj/must-gather-rv9gg" Mar 21 04:16:17 crc kubenswrapper[4685]: I0321 04:16:17.540411 4685 scope.go:117] "RemoveContainer" containerID="c55bc7fca4f251b216c79e669b7e9caad787358e25c805dd7819ae55069f924c" Mar 21 04:16:18 crc kubenswrapper[4685]: I0321 04:16:18.307580 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26eea1ac-1be6-405e-a606-dadf213577a2" path="/var/lib/kubelet/pods/26eea1ac-1be6-405e-a606-dadf213577a2/volumes" Mar 21 04:16:26 crc kubenswrapper[4685]: I0321 04:16:26.300243 4685 scope.go:117] "RemoveContainer" containerID="d777c9ea0ed74e12c330732b76bb70566fa8eb4260f58ef17bc4882bafb2cbc6" Mar 21 04:16:26 crc kubenswrapper[4685]: E0321 04:16:26.300830 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7r9cg_openshift-machine-config-operator(cea46fe2-4e41-43ab-a069-cb30fb4e732c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" Mar 21 04:16:39 crc kubenswrapper[4685]: I0321 04:16:39.300714 4685 scope.go:117] "RemoveContainer" containerID="d777c9ea0ed74e12c330732b76bb70566fa8eb4260f58ef17bc4882bafb2cbc6" Mar 21 04:16:39 crc kubenswrapper[4685]: E0321 04:16:39.301695 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7r9cg_openshift-machine-config-operator(cea46fe2-4e41-43ab-a069-cb30fb4e732c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" Mar 21 04:16:40 crc kubenswrapper[4685]: I0321 04:16:40.015374 4685 scope.go:117] "RemoveContainer" containerID="3509c24ae14ab73179aa9701880d9ee9ca702785c2d16217cd4102275b1d6d17" Mar 21 04:16:52 crc kubenswrapper[4685]: I0321 04:16:52.300709 4685 scope.go:117] "RemoveContainer" containerID="d777c9ea0ed74e12c330732b76bb70566fa8eb4260f58ef17bc4882bafb2cbc6" Mar 21 04:16:52 crc kubenswrapper[4685]: E0321 04:16:52.301715 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7r9cg_openshift-machine-config-operator(cea46fe2-4e41-43ab-a069-cb30fb4e732c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" Mar 21 04:17:03 crc kubenswrapper[4685]: I0321 04:17:03.300820 4685 scope.go:117] "RemoveContainer" containerID="d777c9ea0ed74e12c330732b76bb70566fa8eb4260f58ef17bc4882bafb2cbc6" Mar 21 04:17:03 crc kubenswrapper[4685]: E0321 04:17:03.302419 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7r9cg_openshift-machine-config-operator(cea46fe2-4e41-43ab-a069-cb30fb4e732c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" Mar 21 04:17:14 crc kubenswrapper[4685]: I0321 04:17:14.303734 4685 scope.go:117] "RemoveContainer" containerID="d777c9ea0ed74e12c330732b76bb70566fa8eb4260f58ef17bc4882bafb2cbc6" 
Mar 21 04:17:14 crc kubenswrapper[4685]: E0321 04:17:14.304502 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7r9cg_openshift-machine-config-operator(cea46fe2-4e41-43ab-a069-cb30fb4e732c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c"
Mar 21 04:17:25 crc kubenswrapper[4685]: I0321 04:17:25.301274 4685 scope.go:117] "RemoveContainer" containerID="d777c9ea0ed74e12c330732b76bb70566fa8eb4260f58ef17bc4882bafb2cbc6"
Mar 21 04:17:25 crc kubenswrapper[4685]: E0321 04:17:25.301998 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7r9cg_openshift-machine-config-operator(cea46fe2-4e41-43ab-a069-cb30fb4e732c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c"
Mar 21 04:17:36 crc kubenswrapper[4685]: I0321 04:17:36.301637 4685 scope.go:117] "RemoveContainer" containerID="d777c9ea0ed74e12c330732b76bb70566fa8eb4260f58ef17bc4882bafb2cbc6"
Mar 21 04:17:36 crc kubenswrapper[4685]: E0321 04:17:36.302189 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7r9cg_openshift-machine-config-operator(cea46fe2-4e41-43ab-a069-cb30fb4e732c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c"
Mar 21 04:17:48 crc kubenswrapper[4685]: I0321 04:17:48.305152 4685 scope.go:117] "RemoveContainer" containerID="d777c9ea0ed74e12c330732b76bb70566fa8eb4260f58ef17bc4882bafb2cbc6"
Mar 21 04:17:48 crc kubenswrapper[4685]: E0321 04:17:48.305660 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7r9cg_openshift-machine-config-operator(cea46fe2-4e41-43ab-a069-cb30fb4e732c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c"
Mar 21 04:18:00 crc kubenswrapper[4685]: I0321 04:18:00.138446 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567778-vwdcf"]
Mar 21 04:18:00 crc kubenswrapper[4685]: E0321 04:18:00.138947 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="574a7e26-428c-4869-8643-3d08670b70e8" containerName="oc"
Mar 21 04:18:00 crc kubenswrapper[4685]: I0321 04:18:00.138958 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="574a7e26-428c-4869-8643-3d08670b70e8" containerName="oc"
Mar 21 04:18:00 crc kubenswrapper[4685]: E0321 04:18:00.138973 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26eea1ac-1be6-405e-a606-dadf213577a2" containerName="gather"
Mar 21 04:18:00 crc kubenswrapper[4685]: I0321 04:18:00.138979 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="26eea1ac-1be6-405e-a606-dadf213577a2" containerName="gather"
Mar 21 04:18:00 crc kubenswrapper[4685]: E0321 04:18:00.138989 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26eea1ac-1be6-405e-a606-dadf213577a2" containerName="copy"
Mar 21 04:18:00 crc kubenswrapper[4685]: I0321 04:18:00.138995 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="26eea1ac-1be6-405e-a606-dadf213577a2" containerName="copy"
Mar 21 04:18:00 crc kubenswrapper[4685]: I0321 04:18:00.139086 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="26eea1ac-1be6-405e-a606-dadf213577a2" containerName="gather"
Mar 21 04:18:00 crc kubenswrapper[4685]: I0321 04:18:00.139098 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="26eea1ac-1be6-405e-a606-dadf213577a2" containerName="copy"
Mar 21 04:18:00 crc kubenswrapper[4685]: I0321 04:18:00.139109 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="574a7e26-428c-4869-8643-3d08670b70e8" containerName="oc"
Mar 21 04:18:00 crc kubenswrapper[4685]: I0321 04:18:00.139543 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567778-vwdcf"
Mar 21 04:18:00 crc kubenswrapper[4685]: I0321 04:18:00.141763 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k75cc"
Mar 21 04:18:00 crc kubenswrapper[4685]: I0321 04:18:00.142076 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 04:18:00 crc kubenswrapper[4685]: I0321 04:18:00.142598 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 04:18:00 crc kubenswrapper[4685]: I0321 04:18:00.150229 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567778-vwdcf"]
Mar 21 04:18:00 crc kubenswrapper[4685]: I0321 04:18:00.197409 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d89wx\" (UniqueName: \"kubernetes.io/projected/0930d2b1-6148-41b7-bc0a-2d0be7059adf-kube-api-access-d89wx\") pod \"auto-csr-approver-29567778-vwdcf\" (UID: \"0930d2b1-6148-41b7-bc0a-2d0be7059adf\") " pod="openshift-infra/auto-csr-approver-29567778-vwdcf"
Mar 21 04:18:00 crc kubenswrapper[4685]: I0321 04:18:00.298442 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d89wx\" (UniqueName: \"kubernetes.io/projected/0930d2b1-6148-41b7-bc0a-2d0be7059adf-kube-api-access-d89wx\") pod \"auto-csr-approver-29567778-vwdcf\" (UID: \"0930d2b1-6148-41b7-bc0a-2d0be7059adf\") " pod="openshift-infra/auto-csr-approver-29567778-vwdcf"
Mar 21 04:18:00 crc kubenswrapper[4685]: I0321 04:18:00.300880 4685 scope.go:117] "RemoveContainer" containerID="d777c9ea0ed74e12c330732b76bb70566fa8eb4260f58ef17bc4882bafb2cbc6"
Mar 21 04:18:00 crc kubenswrapper[4685]: E0321 04:18:00.301135 4685 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7r9cg_openshift-machine-config-operator(cea46fe2-4e41-43ab-a069-cb30fb4e732c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c"
Mar 21 04:18:00 crc kubenswrapper[4685]: I0321 04:18:00.324057 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d89wx\" (UniqueName: \"kubernetes.io/projected/0930d2b1-6148-41b7-bc0a-2d0be7059adf-kube-api-access-d89wx\") pod \"auto-csr-approver-29567778-vwdcf\" (UID: \"0930d2b1-6148-41b7-bc0a-2d0be7059adf\") " pod="openshift-infra/auto-csr-approver-29567778-vwdcf"
Mar 21 04:18:00 crc kubenswrapper[4685]: I0321 04:18:00.480108 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567778-vwdcf"
Mar 21 04:18:00 crc kubenswrapper[4685]: I0321 04:18:00.658135 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567778-vwdcf"]
Mar 21 04:18:01 crc kubenswrapper[4685]: I0321 04:18:01.340857 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567778-vwdcf" event={"ID":"0930d2b1-6148-41b7-bc0a-2d0be7059adf","Type":"ContainerStarted","Data":"952d040ecad7122ecf53cee06022b5fd9646ffcef904ffd7ea1f50416df58b02"}
Mar 21 04:18:03 crc kubenswrapper[4685]: I0321 04:18:03.356190 4685 generic.go:334] "Generic (PLEG): container finished" podID="0930d2b1-6148-41b7-bc0a-2d0be7059adf" containerID="f4ababf47aae0a7bf4711bdb14f0d05664cbf0b4c22876271865480060fe8abd" exitCode=0
Mar 21 04:18:03 crc kubenswrapper[4685]: I0321 04:18:03.356304 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567778-vwdcf" event={"ID":"0930d2b1-6148-41b7-bc0a-2d0be7059adf","Type":"ContainerDied","Data":"f4ababf47aae0a7bf4711bdb14f0d05664cbf0b4c22876271865480060fe8abd"}
Mar 21 04:18:04 crc kubenswrapper[4685]: I0321 04:18:04.600938 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567778-vwdcf"
Mar 21 04:18:04 crc kubenswrapper[4685]: I0321 04:18:04.653697 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d89wx\" (UniqueName: \"kubernetes.io/projected/0930d2b1-6148-41b7-bc0a-2d0be7059adf-kube-api-access-d89wx\") pod \"0930d2b1-6148-41b7-bc0a-2d0be7059adf\" (UID: \"0930d2b1-6148-41b7-bc0a-2d0be7059adf\") "
Mar 21 04:18:04 crc kubenswrapper[4685]: I0321 04:18:04.659362 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0930d2b1-6148-41b7-bc0a-2d0be7059adf-kube-api-access-d89wx" (OuterVolumeSpecName: "kube-api-access-d89wx") pod "0930d2b1-6148-41b7-bc0a-2d0be7059adf" (UID: "0930d2b1-6148-41b7-bc0a-2d0be7059adf"). InnerVolumeSpecName "kube-api-access-d89wx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:18:04 crc kubenswrapper[4685]: I0321 04:18:04.755161 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d89wx\" (UniqueName: \"kubernetes.io/projected/0930d2b1-6148-41b7-bc0a-2d0be7059adf-kube-api-access-d89wx\") on node \"crc\" DevicePath \"\""
Mar 21 04:18:05 crc kubenswrapper[4685]: I0321 04:18:05.366938 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567778-vwdcf" event={"ID":"0930d2b1-6148-41b7-bc0a-2d0be7059adf","Type":"ContainerDied","Data":"952d040ecad7122ecf53cee06022b5fd9646ffcef904ffd7ea1f50416df58b02"}
Mar 21 04:18:05 crc kubenswrapper[4685]: I0321 04:18:05.366985 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567778-vwdcf"
Mar 21 04:18:05 crc kubenswrapper[4685]: I0321 04:18:05.366990 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="952d040ecad7122ecf53cee06022b5fd9646ffcef904ffd7ea1f50416df58b02"
Mar 21 04:18:05 crc kubenswrapper[4685]: I0321 04:18:05.674772 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567772-jtj9v"]
Mar 21 04:18:05 crc kubenswrapper[4685]: I0321 04:18:05.679316 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567772-jtj9v"]
Mar 21 04:18:06 crc kubenswrapper[4685]: I0321 04:18:06.310512 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d91ef770-3ed7-4de5-9707-53b8bcad00d0" path="/var/lib/kubelet/pods/d91ef770-3ed7-4de5-9707-53b8bcad00d0/volumes"
Mar 21 04:18:15 crc kubenswrapper[4685]: I0321 04:18:15.301651 4685 scope.go:117] "RemoveContainer" containerID="d777c9ea0ed74e12c330732b76bb70566fa8eb4260f58ef17bc4882bafb2cbc6"
Mar 21 04:18:16 crc kubenswrapper[4685]: I0321 04:18:16.437698 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" event={"ID":"cea46fe2-4e41-43ab-a069-cb30fb4e732c","Type":"ContainerStarted","Data":"051051488a19cbdd0f643f28c1f632ea030019cf5d7470aeea3670bdee12bc0d"}
Mar 21 04:18:40 crc kubenswrapper[4685]: I0321 04:18:40.089459 4685 scope.go:117] "RemoveContainer" containerID="f59d28898c6e04272bb543ccb44c234b72d4d985db9747d660c85eabf5850b17"
Mar 21 04:19:03 crc kubenswrapper[4685]: I0321 04:19:03.939969 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p4mdz"]
Mar 21 04:19:03 crc kubenswrapper[4685]: E0321 04:19:03.941827 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0930d2b1-6148-41b7-bc0a-2d0be7059adf" containerName="oc"
Mar 21 04:19:03 crc kubenswrapper[4685]: I0321 04:19:03.942059 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="0930d2b1-6148-41b7-bc0a-2d0be7059adf" containerName="oc"
Mar 21 04:19:03 crc kubenswrapper[4685]: I0321 04:19:03.942254 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="0930d2b1-6148-41b7-bc0a-2d0be7059adf" containerName="oc"
Mar 21 04:19:03 crc kubenswrapper[4685]: I0321 04:19:03.943026 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p4mdz"
Mar 21 04:19:03 crc kubenswrapper[4685]: I0321 04:19:03.950831 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p4mdz"]
Mar 21 04:19:04 crc kubenswrapper[4685]: I0321 04:19:04.076986 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e151c308-91b1-4b90-b42a-91abf44f2416-catalog-content\") pod \"certified-operators-p4mdz\" (UID: \"e151c308-91b1-4b90-b42a-91abf44f2416\") " pod="openshift-marketplace/certified-operators-p4mdz"
Mar 21 04:19:04 crc kubenswrapper[4685]: I0321 04:19:04.077051 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e151c308-91b1-4b90-b42a-91abf44f2416-utilities\") pod \"certified-operators-p4mdz\" (UID: \"e151c308-91b1-4b90-b42a-91abf44f2416\") " pod="openshift-marketplace/certified-operators-p4mdz"
Mar 21 04:19:04 crc kubenswrapper[4685]: I0321 04:19:04.077095 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtshb\" (UniqueName: \"kubernetes.io/projected/e151c308-91b1-4b90-b42a-91abf44f2416-kube-api-access-dtshb\") pod \"certified-operators-p4mdz\" (UID: \"e151c308-91b1-4b90-b42a-91abf44f2416\") " pod="openshift-marketplace/certified-operators-p4mdz"
Mar 21 04:19:04 crc kubenswrapper[4685]: I0321 04:19:04.177733 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtshb\" (UniqueName: \"kubernetes.io/projected/e151c308-91b1-4b90-b42a-91abf44f2416-kube-api-access-dtshb\") pod \"certified-operators-p4mdz\" (UID: \"e151c308-91b1-4b90-b42a-91abf44f2416\") " pod="openshift-marketplace/certified-operators-p4mdz"
Mar 21 04:19:04 crc kubenswrapper[4685]: I0321 04:19:04.177799 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e151c308-91b1-4b90-b42a-91abf44f2416-catalog-content\") pod \"certified-operators-p4mdz\" (UID: \"e151c308-91b1-4b90-b42a-91abf44f2416\") " pod="openshift-marketplace/certified-operators-p4mdz"
Mar 21 04:19:04 crc kubenswrapper[4685]: I0321 04:19:04.178307 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e151c308-91b1-4b90-b42a-91abf44f2416-catalog-content\") pod \"certified-operators-p4mdz\" (UID: \"e151c308-91b1-4b90-b42a-91abf44f2416\") " pod="openshift-marketplace/certified-operators-p4mdz"
Mar 21 04:19:04 crc kubenswrapper[4685]: I0321 04:19:04.178388 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e151c308-91b1-4b90-b42a-91abf44f2416-utilities\") pod \"certified-operators-p4mdz\" (UID: \"e151c308-91b1-4b90-b42a-91abf44f2416\") " pod="openshift-marketplace/certified-operators-p4mdz"
Mar 21 04:19:04 crc kubenswrapper[4685]: I0321 04:19:04.178600 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e151c308-91b1-4b90-b42a-91abf44f2416-utilities\") pod \"certified-operators-p4mdz\" (UID: \"e151c308-91b1-4b90-b42a-91abf44f2416\") " pod="openshift-marketplace/certified-operators-p4mdz"
Mar 21 04:19:04 crc kubenswrapper[4685]: I0321 04:19:04.194797 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtshb\" (UniqueName: \"kubernetes.io/projected/e151c308-91b1-4b90-b42a-91abf44f2416-kube-api-access-dtshb\") pod \"certified-operators-p4mdz\" (UID: \"e151c308-91b1-4b90-b42a-91abf44f2416\") " pod="openshift-marketplace/certified-operators-p4mdz"
Mar 21 04:19:04 crc kubenswrapper[4685]: I0321 04:19:04.262262 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p4mdz"
Mar 21 04:19:04 crc kubenswrapper[4685]: I0321 04:19:04.598205 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p4mdz"]
Mar 21 04:19:04 crc kubenswrapper[4685]: I0321 04:19:04.709090 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4mdz" event={"ID":"e151c308-91b1-4b90-b42a-91abf44f2416","Type":"ContainerStarted","Data":"61c3232665daa04d07cbecf415a448377427f15463b2fb19ef3a612e0a55d486"}
Mar 21 04:19:05 crc kubenswrapper[4685]: I0321 04:19:05.715857 4685 generic.go:334] "Generic (PLEG): container finished" podID="e151c308-91b1-4b90-b42a-91abf44f2416" containerID="eaf2b5d2c0b74bcfd9013c63d3820043ac6eecf0f7e8bb5fc740195a172b757b" exitCode=0
Mar 21 04:19:05 crc kubenswrapper[4685]: I0321 04:19:05.715927 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4mdz" event={"ID":"e151c308-91b1-4b90-b42a-91abf44f2416","Type":"ContainerDied","Data":"eaf2b5d2c0b74bcfd9013c63d3820043ac6eecf0f7e8bb5fc740195a172b757b"}
Mar 21 04:19:05 crc kubenswrapper[4685]: I0321 04:19:05.718042 4685 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 21 04:19:06 crc kubenswrapper[4685]: I0321 04:19:06.352742 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sjdlc"]
Mar 21 04:19:06 crc kubenswrapper[4685]: I0321 04:19:06.353981 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sjdlc"
Mar 21 04:19:06 crc kubenswrapper[4685]: I0321 04:19:06.364306 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sjdlc"]
Mar 21 04:19:06 crc kubenswrapper[4685]: I0321 04:19:06.405710 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d598c2f-ae26-436d-b72f-019fc5558ddc-catalog-content\") pod \"redhat-marketplace-sjdlc\" (UID: \"7d598c2f-ae26-436d-b72f-019fc5558ddc\") " pod="openshift-marketplace/redhat-marketplace-sjdlc"
Mar 21 04:19:06 crc kubenswrapper[4685]: I0321 04:19:06.405777 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k6zt\" (UniqueName: \"kubernetes.io/projected/7d598c2f-ae26-436d-b72f-019fc5558ddc-kube-api-access-2k6zt\") pod \"redhat-marketplace-sjdlc\" (UID: \"7d598c2f-ae26-436d-b72f-019fc5558ddc\") " pod="openshift-marketplace/redhat-marketplace-sjdlc"
Mar 21 04:19:06 crc kubenswrapper[4685]: I0321 04:19:06.405801 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d598c2f-ae26-436d-b72f-019fc5558ddc-utilities\") pod \"redhat-marketplace-sjdlc\" (UID: \"7d598c2f-ae26-436d-b72f-019fc5558ddc\") " pod="openshift-marketplace/redhat-marketplace-sjdlc"
Mar 21 04:19:06 crc kubenswrapper[4685]: I0321 04:19:06.507309 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k6zt\" (UniqueName: \"kubernetes.io/projected/7d598c2f-ae26-436d-b72f-019fc5558ddc-kube-api-access-2k6zt\") pod \"redhat-marketplace-sjdlc\" (UID: \"7d598c2f-ae26-436d-b72f-019fc5558ddc\") " pod="openshift-marketplace/redhat-marketplace-sjdlc"
Mar 21 04:19:06 crc kubenswrapper[4685]: I0321 04:19:06.507530 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d598c2f-ae26-436d-b72f-019fc5558ddc-utilities\") pod \"redhat-marketplace-sjdlc\" (UID: \"7d598c2f-ae26-436d-b72f-019fc5558ddc\") " pod="openshift-marketplace/redhat-marketplace-sjdlc"
Mar 21 04:19:06 crc kubenswrapper[4685]: I0321 04:19:06.507663 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d598c2f-ae26-436d-b72f-019fc5558ddc-catalog-content\") pod \"redhat-marketplace-sjdlc\" (UID: \"7d598c2f-ae26-436d-b72f-019fc5558ddc\") " pod="openshift-marketplace/redhat-marketplace-sjdlc"
Mar 21 04:19:06 crc kubenswrapper[4685]: I0321 04:19:06.508061 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d598c2f-ae26-436d-b72f-019fc5558ddc-catalog-content\") pod \"redhat-marketplace-sjdlc\" (UID: \"7d598c2f-ae26-436d-b72f-019fc5558ddc\") " pod="openshift-marketplace/redhat-marketplace-sjdlc"
Mar 21 04:19:06 crc kubenswrapper[4685]: I0321 04:19:06.508116 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d598c2f-ae26-436d-b72f-019fc5558ddc-utilities\") pod \"redhat-marketplace-sjdlc\" (UID: \"7d598c2f-ae26-436d-b72f-019fc5558ddc\") " pod="openshift-marketplace/redhat-marketplace-sjdlc"
Mar 21 04:19:06 crc kubenswrapper[4685]: I0321 04:19:06.524912 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k6zt\" (UniqueName: \"kubernetes.io/projected/7d598c2f-ae26-436d-b72f-019fc5558ddc-kube-api-access-2k6zt\") pod \"redhat-marketplace-sjdlc\" (UID: \"7d598c2f-ae26-436d-b72f-019fc5558ddc\") " pod="openshift-marketplace/redhat-marketplace-sjdlc"
Mar 21 04:19:06 crc kubenswrapper[4685]: I0321 04:19:06.673450 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sjdlc"
Mar 21 04:19:06 crc kubenswrapper[4685]: I0321 04:19:06.959698 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sjdlc"]
Mar 21 04:19:06 crc kubenswrapper[4685]: W0321 04:19:06.965609 4685 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d598c2f_ae26_436d_b72f_019fc5558ddc.slice/crio-f31c09fa77258a8191eb1f3cc86f5db214fdc6485f1cd5a72d3411a026399d51 WatchSource:0}: Error finding container f31c09fa77258a8191eb1f3cc86f5db214fdc6485f1cd5a72d3411a026399d51: Status 404 returned error can't find the container with id f31c09fa77258a8191eb1f3cc86f5db214fdc6485f1cd5a72d3411a026399d51
Mar 21 04:19:07 crc kubenswrapper[4685]: I0321 04:19:07.742468 4685 generic.go:334] "Generic (PLEG): container finished" podID="e151c308-91b1-4b90-b42a-91abf44f2416" containerID="c5fede3ccceefff6cd30c1c623c586b0192ef623919bca21a9d78c93730485cc" exitCode=0
Mar 21 04:19:07 crc kubenswrapper[4685]: I0321 04:19:07.742545 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4mdz" event={"ID":"e151c308-91b1-4b90-b42a-91abf44f2416","Type":"ContainerDied","Data":"c5fede3ccceefff6cd30c1c623c586b0192ef623919bca21a9d78c93730485cc"}
Mar 21 04:19:07 crc kubenswrapper[4685]: I0321 04:19:07.743764 4685 generic.go:334] "Generic (PLEG): container finished" podID="7d598c2f-ae26-436d-b72f-019fc5558ddc" containerID="0ba63b6427d00d32ac8392dc941e586fc4e6c6d7f88f0ef58efd77f17b367b3d" exitCode=0
Mar 21 04:19:07 crc kubenswrapper[4685]: I0321 04:19:07.743788 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sjdlc" event={"ID":"7d598c2f-ae26-436d-b72f-019fc5558ddc","Type":"ContainerDied","Data":"0ba63b6427d00d32ac8392dc941e586fc4e6c6d7f88f0ef58efd77f17b367b3d"}
Mar 21 04:19:07 crc kubenswrapper[4685]: I0321 04:19:07.743803 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sjdlc" event={"ID":"7d598c2f-ae26-436d-b72f-019fc5558ddc","Type":"ContainerStarted","Data":"f31c09fa77258a8191eb1f3cc86f5db214fdc6485f1cd5a72d3411a026399d51"}
Mar 21 04:19:09 crc kubenswrapper[4685]: I0321 04:19:09.756345 4685 generic.go:334] "Generic (PLEG): container finished" podID="7d598c2f-ae26-436d-b72f-019fc5558ddc" containerID="a8cd3a9a091203257a5394574d41b6c97a1733c5c796aee7d5ef2ead2e498b8d" exitCode=0
Mar 21 04:19:09 crc kubenswrapper[4685]: I0321 04:19:09.756504 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sjdlc" event={"ID":"7d598c2f-ae26-436d-b72f-019fc5558ddc","Type":"ContainerDied","Data":"a8cd3a9a091203257a5394574d41b6c97a1733c5c796aee7d5ef2ead2e498b8d"}
Mar 21 04:19:09 crc kubenswrapper[4685]: I0321 04:19:09.762934 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4mdz" event={"ID":"e151c308-91b1-4b90-b42a-91abf44f2416","Type":"ContainerStarted","Data":"d9d15e59c002e646a666fe5e418281e5a6dce68d8e7087378681d41f9012ea60"}
Mar 21 04:19:09 crc kubenswrapper[4685]: I0321 04:19:09.799425 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p4mdz" podStartSLOduration=4.028437538 podStartE2EDuration="6.799405142s" podCreationTimestamp="2026-03-21 04:19:03 +0000 UTC" firstStartedPulling="2026-03-21 04:19:05.717799228 +0000 UTC m=+1978.194868020" lastFinishedPulling="2026-03-21 04:19:08.488766832 +0000 UTC m=+1980.965835624" observedRunningTime="2026-03-21 04:19:09.798480306 +0000 UTC m=+1982.275549098" watchObservedRunningTime="2026-03-21 04:19:09.799405142 +0000 UTC m=+1982.276473934"
Mar 21 04:19:10 crc kubenswrapper[4685]: I0321 04:19:10.771213 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sjdlc" event={"ID":"7d598c2f-ae26-436d-b72f-019fc5558ddc","Type":"ContainerStarted","Data":"dc9185604b7e4bce3ed1f7996fe9b9519aacb1f1365ed4bbd64f406b801b1454"}
Mar 21 04:19:10 crc kubenswrapper[4685]: I0321 04:19:10.788386 4685 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sjdlc" podStartSLOduration=1.988552466 podStartE2EDuration="4.788368279s" podCreationTimestamp="2026-03-21 04:19:06 +0000 UTC" firstStartedPulling="2026-03-21 04:19:07.747047977 +0000 UTC m=+1980.224116809" lastFinishedPulling="2026-03-21 04:19:10.54686382 +0000 UTC m=+1983.023932622" observedRunningTime="2026-03-21 04:19:10.787773912 +0000 UTC m=+1983.264842724" watchObservedRunningTime="2026-03-21 04:19:10.788368279 +0000 UTC m=+1983.265437071"
Mar 21 04:19:14 crc kubenswrapper[4685]: I0321 04:19:14.262878 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p4mdz"
Mar 21 04:19:14 crc kubenswrapper[4685]: I0321 04:19:14.262939 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p4mdz"
Mar 21 04:19:14 crc kubenswrapper[4685]: I0321 04:19:14.298229 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p4mdz"
Mar 21 04:19:15 crc kubenswrapper[4685]: I0321 04:19:15.264096 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p4mdz"
Mar 21 04:19:16 crc kubenswrapper[4685]: I0321 04:19:16.330100 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p4mdz"]
Mar 21 04:19:16 crc kubenswrapper[4685]: I0321 04:19:16.674458 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sjdlc"
Mar 21 04:19:16 crc kubenswrapper[4685]: I0321 04:19:16.674556 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sjdlc"
Mar 21 04:19:16 crc kubenswrapper[4685]: I0321 04:19:16.711243 4685 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sjdlc"
Mar 21 04:19:17 crc kubenswrapper[4685]: I0321 04:19:17.238127 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p4mdz" podUID="e151c308-91b1-4b90-b42a-91abf44f2416" containerName="registry-server" containerID="cri-o://d9d15e59c002e646a666fe5e418281e5a6dce68d8e7087378681d41f9012ea60" gracePeriod=2
Mar 21 04:19:17 crc kubenswrapper[4685]: I0321 04:19:17.281781 4685 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sjdlc"
Mar 21 04:19:18 crc kubenswrapper[4685]: I0321 04:19:18.174654 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p4mdz"
Mar 21 04:19:18 crc kubenswrapper[4685]: I0321 04:19:18.245544 4685 generic.go:334] "Generic (PLEG): container finished" podID="e151c308-91b1-4b90-b42a-91abf44f2416" containerID="d9d15e59c002e646a666fe5e418281e5a6dce68d8e7087378681d41f9012ea60" exitCode=0
Mar 21 04:19:18 crc kubenswrapper[4685]: I0321 04:19:18.246314 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p4mdz"
Mar 21 04:19:18 crc kubenswrapper[4685]: I0321 04:19:18.246662 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4mdz" event={"ID":"e151c308-91b1-4b90-b42a-91abf44f2416","Type":"ContainerDied","Data":"d9d15e59c002e646a666fe5e418281e5a6dce68d8e7087378681d41f9012ea60"}
Mar 21 04:19:18 crc kubenswrapper[4685]: I0321 04:19:18.246693 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4mdz" event={"ID":"e151c308-91b1-4b90-b42a-91abf44f2416","Type":"ContainerDied","Data":"61c3232665daa04d07cbecf415a448377427f15463b2fb19ef3a612e0a55d486"}
Mar 21 04:19:18 crc kubenswrapper[4685]: I0321 04:19:18.246709 4685 scope.go:117] "RemoveContainer" containerID="d9d15e59c002e646a666fe5e418281e5a6dce68d8e7087378681d41f9012ea60"
Mar 21 04:19:18 crc kubenswrapper[4685]: I0321 04:19:18.263335 4685 scope.go:117] "RemoveContainer" containerID="c5fede3ccceefff6cd30c1c623c586b0192ef623919bca21a9d78c93730485cc"
Mar 21 04:19:18 crc kubenswrapper[4685]: I0321 04:19:18.266430 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtshb\" (UniqueName: \"kubernetes.io/projected/e151c308-91b1-4b90-b42a-91abf44f2416-kube-api-access-dtshb\") pod \"e151c308-91b1-4b90-b42a-91abf44f2416\" (UID: \"e151c308-91b1-4b90-b42a-91abf44f2416\") "
Mar 21 04:19:18 crc kubenswrapper[4685]: I0321 04:19:18.266496 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e151c308-91b1-4b90-b42a-91abf44f2416-catalog-content\") pod \"e151c308-91b1-4b90-b42a-91abf44f2416\" (UID: \"e151c308-91b1-4b90-b42a-91abf44f2416\") "
Mar 21 04:19:18 crc kubenswrapper[4685]: I0321 04:19:18.266572 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e151c308-91b1-4b90-b42a-91abf44f2416-utilities\") pod \"e151c308-91b1-4b90-b42a-91abf44f2416\" (UID: \"e151c308-91b1-4b90-b42a-91abf44f2416\") "
Mar 21 04:19:18 crc kubenswrapper[4685]: I0321 04:19:18.268721 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e151c308-91b1-4b90-b42a-91abf44f2416-utilities" (OuterVolumeSpecName: "utilities") pod "e151c308-91b1-4b90-b42a-91abf44f2416" (UID: "e151c308-91b1-4b90-b42a-91abf44f2416"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:19:18 crc kubenswrapper[4685]: I0321 04:19:18.289549 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e151c308-91b1-4b90-b42a-91abf44f2416-kube-api-access-dtshb" (OuterVolumeSpecName: "kube-api-access-dtshb") pod "e151c308-91b1-4b90-b42a-91abf44f2416" (UID: "e151c308-91b1-4b90-b42a-91abf44f2416"). InnerVolumeSpecName "kube-api-access-dtshb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:19:18 crc kubenswrapper[4685]: I0321 04:19:18.303168 4685 scope.go:117] "RemoveContainer" containerID="eaf2b5d2c0b74bcfd9013c63d3820043ac6eecf0f7e8bb5fc740195a172b757b"
Mar 21 04:19:18 crc kubenswrapper[4685]: I0321 04:19:18.322649 4685 scope.go:117] "RemoveContainer" containerID="d9d15e59c002e646a666fe5e418281e5a6dce68d8e7087378681d41f9012ea60"
Mar 21 04:19:18 crc kubenswrapper[4685]: E0321 04:19:18.323216 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9d15e59c002e646a666fe5e418281e5a6dce68d8e7087378681d41f9012ea60\": container with ID starting with d9d15e59c002e646a666fe5e418281e5a6dce68d8e7087378681d41f9012ea60 not found: ID does not exist" containerID="d9d15e59c002e646a666fe5e418281e5a6dce68d8e7087378681d41f9012ea60"
Mar 21 04:19:18 crc kubenswrapper[4685]: I0321 04:19:18.323273 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9d15e59c002e646a666fe5e418281e5a6dce68d8e7087378681d41f9012ea60"} err="failed to get container status \"d9d15e59c002e646a666fe5e418281e5a6dce68d8e7087378681d41f9012ea60\": rpc error: code = NotFound desc = could not find container \"d9d15e59c002e646a666fe5e418281e5a6dce68d8e7087378681d41f9012ea60\": container with ID starting with d9d15e59c002e646a666fe5e418281e5a6dce68d8e7087378681d41f9012ea60 not found: ID does not exist"
Mar 21 04:19:18 crc kubenswrapper[4685]: I0321 04:19:18.323302 4685 scope.go:117] "RemoveContainer" containerID="c5fede3ccceefff6cd30c1c623c586b0192ef623919bca21a9d78c93730485cc"
Mar 21 04:19:18 crc kubenswrapper[4685]: E0321 04:19:18.324038 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5fede3ccceefff6cd30c1c623c586b0192ef623919bca21a9d78c93730485cc\": container with ID starting with c5fede3ccceefff6cd30c1c623c586b0192ef623919bca21a9d78c93730485cc not found: ID does not exist" containerID="c5fede3ccceefff6cd30c1c623c586b0192ef623919bca21a9d78c93730485cc"
Mar 21 04:19:18 crc kubenswrapper[4685]: I0321 04:19:18.324065 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5fede3ccceefff6cd30c1c623c586b0192ef623919bca21a9d78c93730485cc"} err="failed to get container status \"c5fede3ccceefff6cd30c1c623c586b0192ef623919bca21a9d78c93730485cc\": rpc error: code = NotFound desc = could not find container \"c5fede3ccceefff6cd30c1c623c586b0192ef623919bca21a9d78c93730485cc\": container with ID starting with c5fede3ccceefff6cd30c1c623c586b0192ef623919bca21a9d78c93730485cc not found: ID does not exist"
Mar 21 04:19:18 crc kubenswrapper[4685]: I0321 04:19:18.324081 4685 scope.go:117] "RemoveContainer" containerID="eaf2b5d2c0b74bcfd9013c63d3820043ac6eecf0f7e8bb5fc740195a172b757b"
Mar 21 04:19:18 crc kubenswrapper[4685]: E0321 04:19:18.324400 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaf2b5d2c0b74bcfd9013c63d3820043ac6eecf0f7e8bb5fc740195a172b757b\": container with ID starting with eaf2b5d2c0b74bcfd9013c63d3820043ac6eecf0f7e8bb5fc740195a172b757b not found: ID does not exist" containerID="eaf2b5d2c0b74bcfd9013c63d3820043ac6eecf0f7e8bb5fc740195a172b757b"
Mar 21 04:19:18 crc kubenswrapper[4685]: I0321 04:19:18.324442 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaf2b5d2c0b74bcfd9013c63d3820043ac6eecf0f7e8bb5fc740195a172b757b"} err="failed to get container status \"eaf2b5d2c0b74bcfd9013c63d3820043ac6eecf0f7e8bb5fc740195a172b757b\": rpc error: code = NotFound desc = could not find container \"eaf2b5d2c0b74bcfd9013c63d3820043ac6eecf0f7e8bb5fc740195a172b757b\": container with ID starting with eaf2b5d2c0b74bcfd9013c63d3820043ac6eecf0f7e8bb5fc740195a172b757b not found: ID does not exist"
Mar 21 04:19:18 crc kubenswrapper[4685]: I0321 04:19:18.333821 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e151c308-91b1-4b90-b42a-91abf44f2416-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e151c308-91b1-4b90-b42a-91abf44f2416" (UID: "e151c308-91b1-4b90-b42a-91abf44f2416"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:19:18 crc kubenswrapper[4685]: I0321 04:19:18.367571 4685 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e151c308-91b1-4b90-b42a-91abf44f2416-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 21 04:19:18 crc kubenswrapper[4685]: I0321 04:19:18.367624 4685 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e151c308-91b1-4b90-b42a-91abf44f2416-utilities\") on node \"crc\" DevicePath \"\""
Mar 21 04:19:18 crc kubenswrapper[4685]: I0321 04:19:18.367639 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtshb\" (UniqueName: \"kubernetes.io/projected/e151c308-91b1-4b90-b42a-91abf44f2416-kube-api-access-dtshb\") on node \"crc\" DevicePath \"\""
Mar 21 04:19:18 crc kubenswrapper[4685]: I0321 04:19:18.576796 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p4mdz"]
Mar 21 04:19:18 crc kubenswrapper[4685]: I0321 04:19:18.580662 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p4mdz"]
Mar 21 04:19:19 crc kubenswrapper[4685]: I0321 04:19:19.134295 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sjdlc"]
Mar 21 04:19:19 crc kubenswrapper[4685]: I0321 04:19:19.251515 4685 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sjdlc" podUID="7d598c2f-ae26-436d-b72f-019fc5558ddc" containerName="registry-server" containerID="cri-o://dc9185604b7e4bce3ed1f7996fe9b9519aacb1f1365ed4bbd64f406b801b1454" gracePeriod=2
Mar 21 04:19:19 crc kubenswrapper[4685]: I0321 04:19:19.565510 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sjdlc"
Mar 21 04:19:19 crc kubenswrapper[4685]: I0321 04:19:19.683768 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2k6zt\" (UniqueName: \"kubernetes.io/projected/7d598c2f-ae26-436d-b72f-019fc5558ddc-kube-api-access-2k6zt\") pod \"7d598c2f-ae26-436d-b72f-019fc5558ddc\" (UID: \"7d598c2f-ae26-436d-b72f-019fc5558ddc\") "
Mar 21 04:19:19 crc kubenswrapper[4685]: I0321 04:19:19.683845 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d598c2f-ae26-436d-b72f-019fc5558ddc-catalog-content\") pod \"7d598c2f-ae26-436d-b72f-019fc5558ddc\" (UID: \"7d598c2f-ae26-436d-b72f-019fc5558ddc\") "
Mar 21 04:19:19 crc kubenswrapper[4685]: I0321 04:19:19.683877 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d598c2f-ae26-436d-b72f-019fc5558ddc-utilities\") pod \"7d598c2f-ae26-436d-b72f-019fc5558ddc\" (UID: \"7d598c2f-ae26-436d-b72f-019fc5558ddc\") "
Mar 21 04:19:19 crc kubenswrapper[4685]: I0321 04:19:19.684822 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d598c2f-ae26-436d-b72f-019fc5558ddc-utilities" (OuterVolumeSpecName: "utilities") pod "7d598c2f-ae26-436d-b72f-019fc5558ddc" (UID: "7d598c2f-ae26-436d-b72f-019fc5558ddc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:19:19 crc kubenswrapper[4685]: I0321 04:19:19.688316 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d598c2f-ae26-436d-b72f-019fc5558ddc-kube-api-access-2k6zt" (OuterVolumeSpecName: "kube-api-access-2k6zt") pod "7d598c2f-ae26-436d-b72f-019fc5558ddc" (UID: "7d598c2f-ae26-436d-b72f-019fc5558ddc"). InnerVolumeSpecName "kube-api-access-2k6zt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:19:19 crc kubenswrapper[4685]: I0321 04:19:19.716568 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d598c2f-ae26-436d-b72f-019fc5558ddc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7d598c2f-ae26-436d-b72f-019fc5558ddc" (UID: "7d598c2f-ae26-436d-b72f-019fc5558ddc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:19:19 crc kubenswrapper[4685]: I0321 04:19:19.785646 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2k6zt\" (UniqueName: \"kubernetes.io/projected/7d598c2f-ae26-436d-b72f-019fc5558ddc-kube-api-access-2k6zt\") on node \"crc\" DevicePath \"\""
Mar 21 04:19:19 crc kubenswrapper[4685]: I0321 04:19:19.785681 4685 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d598c2f-ae26-436d-b72f-019fc5558ddc-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 21 04:19:19 crc kubenswrapper[4685]: I0321 04:19:19.785690 4685 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d598c2f-ae26-436d-b72f-019fc5558ddc-utilities\") on node \"crc\" DevicePath \"\""
Mar 21 04:19:20 crc kubenswrapper[4685]: I0321 04:19:20.257903 4685 generic.go:334] "Generic (PLEG): container finished" podID="7d598c2f-ae26-436d-b72f-019fc5558ddc" containerID="dc9185604b7e4bce3ed1f7996fe9b9519aacb1f1365ed4bbd64f406b801b1454" exitCode=0
Mar 21 04:19:20 crc kubenswrapper[4685]: I0321 04:19:20.257963 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sjdlc"
Mar 21 04:19:20 crc kubenswrapper[4685]: I0321 04:19:20.258000 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sjdlc" event={"ID":"7d598c2f-ae26-436d-b72f-019fc5558ddc","Type":"ContainerDied","Data":"dc9185604b7e4bce3ed1f7996fe9b9519aacb1f1365ed4bbd64f406b801b1454"}
Mar 21 04:19:20 crc kubenswrapper[4685]: I0321 04:19:20.258348 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sjdlc" event={"ID":"7d598c2f-ae26-436d-b72f-019fc5558ddc","Type":"ContainerDied","Data":"f31c09fa77258a8191eb1f3cc86f5db214fdc6485f1cd5a72d3411a026399d51"}
Mar 21 04:19:20 crc kubenswrapper[4685]: I0321 04:19:20.258628 4685 scope.go:117] "RemoveContainer" containerID="dc9185604b7e4bce3ed1f7996fe9b9519aacb1f1365ed4bbd64f406b801b1454"
Mar 21 04:19:20 crc kubenswrapper[4685]: I0321 04:19:20.276799 4685 scope.go:117] "RemoveContainer" containerID="a8cd3a9a091203257a5394574d41b6c97a1733c5c796aee7d5ef2ead2e498b8d"
Mar 21 04:19:20 crc kubenswrapper[4685]: I0321 04:19:20.310775 4685 scope.go:117] "RemoveContainer" containerID="0ba63b6427d00d32ac8392dc941e586fc4e6c6d7f88f0ef58efd77f17b367b3d"
Mar 21 04:19:20 crc kubenswrapper[4685]: I0321 04:19:20.313348 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e151c308-91b1-4b90-b42a-91abf44f2416" path="/var/lib/kubelet/pods/e151c308-91b1-4b90-b42a-91abf44f2416/volumes"
Mar 21 04:19:20 crc kubenswrapper[4685]: I0321 04:19:20.313860 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sjdlc"]
Mar 21 04:19:20 crc kubenswrapper[4685]: I0321 04:19:20.313881 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sjdlc"]
Mar 21 04:19:20 crc kubenswrapper[4685]: I0321 04:19:20.324913 4685 scope.go:117] "RemoveContainer" containerID="dc9185604b7e4bce3ed1f7996fe9b9519aacb1f1365ed4bbd64f406b801b1454"
Mar 21 04:19:20 crc kubenswrapper[4685]: E0321 04:19:20.325280 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc9185604b7e4bce3ed1f7996fe9b9519aacb1f1365ed4bbd64f406b801b1454\": container with ID starting with dc9185604b7e4bce3ed1f7996fe9b9519aacb1f1365ed4bbd64f406b801b1454 not found: ID does not exist" containerID="dc9185604b7e4bce3ed1f7996fe9b9519aacb1f1365ed4bbd64f406b801b1454"
Mar 21 04:19:20 crc kubenswrapper[4685]: I0321 04:19:20.325353 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc9185604b7e4bce3ed1f7996fe9b9519aacb1f1365ed4bbd64f406b801b1454"} err="failed to get container status \"dc9185604b7e4bce3ed1f7996fe9b9519aacb1f1365ed4bbd64f406b801b1454\": rpc error: code = NotFound desc = could not find container \"dc9185604b7e4bce3ed1f7996fe9b9519aacb1f1365ed4bbd64f406b801b1454\": container with ID starting with dc9185604b7e4bce3ed1f7996fe9b9519aacb1f1365ed4bbd64f406b801b1454 not found: ID does not exist"
Mar 21 04:19:20 crc kubenswrapper[4685]: I0321 04:19:20.325372 4685 scope.go:117] "RemoveContainer" containerID="a8cd3a9a091203257a5394574d41b6c97a1733c5c796aee7d5ef2ead2e498b8d"
Mar 21 04:19:20 crc kubenswrapper[4685]: E0321 04:19:20.325701 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8cd3a9a091203257a5394574d41b6c97a1733c5c796aee7d5ef2ead2e498b8d\": container with ID starting with a8cd3a9a091203257a5394574d41b6c97a1733c5c796aee7d5ef2ead2e498b8d not found: ID does not exist" containerID="a8cd3a9a091203257a5394574d41b6c97a1733c5c796aee7d5ef2ead2e498b8d"
Mar 21 04:19:20 crc kubenswrapper[4685]: I0321 04:19:20.325723 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8cd3a9a091203257a5394574d41b6c97a1733c5c796aee7d5ef2ead2e498b8d"} err="failed to get container status \"a8cd3a9a091203257a5394574d41b6c97a1733c5c796aee7d5ef2ead2e498b8d\": rpc error: code = NotFound desc = could not find container \"a8cd3a9a091203257a5394574d41b6c97a1733c5c796aee7d5ef2ead2e498b8d\": container with ID starting with a8cd3a9a091203257a5394574d41b6c97a1733c5c796aee7d5ef2ead2e498b8d not found: ID does not exist"
Mar 21 04:19:20 crc kubenswrapper[4685]: I0321 04:19:20.325738 4685 scope.go:117] "RemoveContainer" containerID="0ba63b6427d00d32ac8392dc941e586fc4e6c6d7f88f0ef58efd77f17b367b3d"
Mar 21 04:19:20 crc kubenswrapper[4685]: E0321 04:19:20.325978 4685 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ba63b6427d00d32ac8392dc941e586fc4e6c6d7f88f0ef58efd77f17b367b3d\": container with ID starting with 0ba63b6427d00d32ac8392dc941e586fc4e6c6d7f88f0ef58efd77f17b367b3d not found: ID does not exist" containerID="0ba63b6427d00d32ac8392dc941e586fc4e6c6d7f88f0ef58efd77f17b367b3d"
Mar 21 04:19:20 crc kubenswrapper[4685]: I0321 04:19:20.325997 4685 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ba63b6427d00d32ac8392dc941e586fc4e6c6d7f88f0ef58efd77f17b367b3d"} err="failed to get container status \"0ba63b6427d00d32ac8392dc941e586fc4e6c6d7f88f0ef58efd77f17b367b3d\": rpc error: code = NotFound desc = could not find container \"0ba63b6427d00d32ac8392dc941e586fc4e6c6d7f88f0ef58efd77f17b367b3d\": container with ID starting with 0ba63b6427d00d32ac8392dc941e586fc4e6c6d7f88f0ef58efd77f17b367b3d not found: ID does not exist"
Mar 21 04:19:22 crc kubenswrapper[4685]: I0321 04:19:22.306480 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d598c2f-ae26-436d-b72f-019fc5558ddc" path="/var/lib/kubelet/pods/7d598c2f-ae26-436d-b72f-019fc5558ddc/volumes"
Mar 21 04:20:00 crc kubenswrapper[4685]: I0321 04:20:00.135116 4685 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567780-vwjxn"]
Mar 21 04:20:00 crc kubenswrapper[4685]: E0321 04:20:00.136559 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d598c2f-ae26-436d-b72f-019fc5558ddc" containerName="extract-content"
Mar 21 04:20:00 crc kubenswrapper[4685]: I0321 04:20:00.136586 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d598c2f-ae26-436d-b72f-019fc5558ddc" containerName="extract-content"
Mar 21 04:20:00 crc kubenswrapper[4685]: E0321 04:20:00.136613 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e151c308-91b1-4b90-b42a-91abf44f2416" containerName="registry-server"
Mar 21 04:20:00 crc kubenswrapper[4685]: I0321 04:20:00.136627 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="e151c308-91b1-4b90-b42a-91abf44f2416" containerName="registry-server"
Mar 21 04:20:00 crc kubenswrapper[4685]: E0321 04:20:00.136658 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e151c308-91b1-4b90-b42a-91abf44f2416" containerName="extract-content"
Mar 21 04:20:00 crc kubenswrapper[4685]: I0321 04:20:00.136671 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="e151c308-91b1-4b90-b42a-91abf44f2416" containerName="extract-content"
Mar 21 04:20:00 crc kubenswrapper[4685]: E0321 04:20:00.136704 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d598c2f-ae26-436d-b72f-019fc5558ddc" containerName="extract-utilities"
Mar 21 04:20:00 crc kubenswrapper[4685]: I0321 04:20:00.136720 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d598c2f-ae26-436d-b72f-019fc5558ddc" containerName="extract-utilities"
Mar 21 04:20:00 crc kubenswrapper[4685]: E0321 04:20:00.136741 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e151c308-91b1-4b90-b42a-91abf44f2416" containerName="extract-utilities"
Mar 21 04:20:00 crc kubenswrapper[4685]: I0321 04:20:00.136755 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="e151c308-91b1-4b90-b42a-91abf44f2416" containerName="extract-utilities"
Mar 21 04:20:00 crc kubenswrapper[4685]: E0321 04:20:00.136771 4685 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d598c2f-ae26-436d-b72f-019fc5558ddc" containerName="registry-server"
Mar 21 04:20:00 crc kubenswrapper[4685]: I0321 04:20:00.136785 4685 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d598c2f-ae26-436d-b72f-019fc5558ddc" containerName="registry-server"
Mar 21 04:20:00 crc kubenswrapper[4685]: I0321 04:20:00.137011 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d598c2f-ae26-436d-b72f-019fc5558ddc" containerName="registry-server"
Mar 21 04:20:00 crc kubenswrapper[4685]: I0321 04:20:00.137045 4685 memory_manager.go:354] "RemoveStaleState removing state" podUID="e151c308-91b1-4b90-b42a-91abf44f2416" containerName="registry-server"
Mar 21 04:20:00 crc kubenswrapper[4685]: I0321 04:20:00.137792 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567780-vwjxn"
Mar 21 04:20:00 crc kubenswrapper[4685]: I0321 04:20:00.141856 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 04:20:00 crc kubenswrapper[4685]: I0321 04:20:00.141936 4685 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 04:20:00 crc kubenswrapper[4685]: I0321 04:20:00.142229 4685 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k75cc"
Mar 21 04:20:00 crc kubenswrapper[4685]: I0321 04:20:00.144759 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567780-vwjxn"]
Mar 21 04:20:00 crc kubenswrapper[4685]: I0321 04:20:00.230114 4685 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffbv8\" (UniqueName: \"kubernetes.io/projected/263460d6-6037-4b89-a8d1-bc862dac0048-kube-api-access-ffbv8\") pod \"auto-csr-approver-29567780-vwjxn\" (UID: \"263460d6-6037-4b89-a8d1-bc862dac0048\") " pod="openshift-infra/auto-csr-approver-29567780-vwjxn"
Mar 21 04:20:00 crc kubenswrapper[4685]: I0321 04:20:00.331339 4685 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffbv8\" (UniqueName: \"kubernetes.io/projected/263460d6-6037-4b89-a8d1-bc862dac0048-kube-api-access-ffbv8\") pod \"auto-csr-approver-29567780-vwjxn\" (UID: \"263460d6-6037-4b89-a8d1-bc862dac0048\") " pod="openshift-infra/auto-csr-approver-29567780-vwjxn"
Mar 21 04:20:00 crc kubenswrapper[4685]: I0321 04:20:00.356870 4685 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffbv8\" (UniqueName: \"kubernetes.io/projected/263460d6-6037-4b89-a8d1-bc862dac0048-kube-api-access-ffbv8\") pod \"auto-csr-approver-29567780-vwjxn\" (UID: \"263460d6-6037-4b89-a8d1-bc862dac0048\") " pod="openshift-infra/auto-csr-approver-29567780-vwjxn"
Mar 21 04:20:00 crc kubenswrapper[4685]: I0321 04:20:00.462132 4685 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567780-vwjxn"
Mar 21 04:20:00 crc kubenswrapper[4685]: I0321 04:20:00.669152 4685 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567780-vwjxn"]
Mar 21 04:20:01 crc kubenswrapper[4685]: I0321 04:20:01.512211 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567780-vwjxn" event={"ID":"263460d6-6037-4b89-a8d1-bc862dac0048","Type":"ContainerStarted","Data":"7d59f6323a17d4cba82b5a0c749e89a77d69b68ff7666e34c4a000808b877170"}
Mar 21 04:20:02 crc kubenswrapper[4685]: I0321 04:20:02.520055 4685 generic.go:334] "Generic (PLEG): container finished" podID="263460d6-6037-4b89-a8d1-bc862dac0048" containerID="ecbcebb973f86605185fd59eeecd4f5a989a862d6188e39d98c1973eb1fdf72f" exitCode=0
Mar 21 04:20:02 crc kubenswrapper[4685]: I0321 04:20:02.520324 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567780-vwjxn" event={"ID":"263460d6-6037-4b89-a8d1-bc862dac0048","Type":"ContainerDied","Data":"ecbcebb973f86605185fd59eeecd4f5a989a862d6188e39d98c1973eb1fdf72f"}
Mar 21 04:20:03 crc kubenswrapper[4685]: I0321 04:20:03.759358 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567780-vwjxn"
Mar 21 04:20:03 crc kubenswrapper[4685]: I0321 04:20:03.874199 4685 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffbv8\" (UniqueName: \"kubernetes.io/projected/263460d6-6037-4b89-a8d1-bc862dac0048-kube-api-access-ffbv8\") pod \"263460d6-6037-4b89-a8d1-bc862dac0048\" (UID: \"263460d6-6037-4b89-a8d1-bc862dac0048\") "
Mar 21 04:20:03 crc kubenswrapper[4685]: I0321 04:20:03.879701 4685 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/263460d6-6037-4b89-a8d1-bc862dac0048-kube-api-access-ffbv8" (OuterVolumeSpecName: "kube-api-access-ffbv8") pod "263460d6-6037-4b89-a8d1-bc862dac0048" (UID: "263460d6-6037-4b89-a8d1-bc862dac0048"). InnerVolumeSpecName "kube-api-access-ffbv8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:20:03 crc kubenswrapper[4685]: I0321 04:20:03.975921 4685 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffbv8\" (UniqueName: \"kubernetes.io/projected/263460d6-6037-4b89-a8d1-bc862dac0048-kube-api-access-ffbv8\") on node \"crc\" DevicePath \"\""
Mar 21 04:20:04 crc kubenswrapper[4685]: I0321 04:20:04.533776 4685 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567780-vwjxn" event={"ID":"263460d6-6037-4b89-a8d1-bc862dac0048","Type":"ContainerDied","Data":"7d59f6323a17d4cba82b5a0c749e89a77d69b68ff7666e34c4a000808b877170"}
Mar 21 04:20:04 crc kubenswrapper[4685]: I0321 04:20:04.533911 4685 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567780-vwjxn"
Mar 21 04:20:04 crc kubenswrapper[4685]: I0321 04:20:04.534137 4685 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d59f6323a17d4cba82b5a0c749e89a77d69b68ff7666e34c4a000808b877170"
Mar 21 04:20:04 crc kubenswrapper[4685]: I0321 04:20:04.838162 4685 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567774-pnl9j"]
Mar 21 04:20:04 crc kubenswrapper[4685]: I0321 04:20:04.845147 4685 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567774-pnl9j"]
Mar 21 04:20:06 crc kubenswrapper[4685]: I0321 04:20:06.308860 4685 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ba66306-04db-4c93-980e-680dd8410a44" path="/var/lib/kubelet/pods/7ba66306-04db-4c93-980e-680dd8410a44/volumes"
Mar 21 04:20:39 crc kubenswrapper[4685]: I0321 04:20:39.685913 4685 patch_prober.go:28] interesting pod/machine-config-daemon-7r9cg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 21 04:20:39 crc kubenswrapper[4685]: I0321 04:20:39.686515 4685 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7r9cg" podUID="cea46fe2-4e41-43ab-a069-cb30fb4e732c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 21 04:20:40 crc kubenswrapper[4685]: I0321 04:20:40.171471 4685 scope.go:117] "RemoveContainer" containerID="700b498d68ebba4ca8448b18d640eb5cadd818dfea0fffe88ed8c8bcd3457240"